Mar 10 15:06:05 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 10 15:06:05 crc restorecon[4695]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 10 15:06:05 crc restorecon[4695]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 15:06:05 crc restorecon[4695]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc 
restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:05 crc restorecon[4695]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc 
restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 
15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:06:05 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 
15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:06:06 crc 
restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc 
restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 10 15:06:06 crc restorecon[4695]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc 
restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 
crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc 
restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc 
restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 10 15:06:06 crc restorecon[4695]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc 
restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:06:06 crc restorecon[4695]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 10 15:06:06 crc restorecon[4695]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 10 15:06:07 crc kubenswrapper[4795]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 15:06:07 crc kubenswrapper[4795]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 10 15:06:07 crc kubenswrapper[4795]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 15:06:07 crc kubenswrapper[4795]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 10 15:06:07 crc kubenswrapper[4795]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 10 15:06:07 crc kubenswrapper[4795]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.256526 4795 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265415 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265439 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265445 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265449 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265453 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265459 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265463 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265467 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265471 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 
15:06:07.265475 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265479 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265483 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265488 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265492 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265496 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265501 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265505 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265509 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265513 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265517 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265522 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265525 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265529 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265533 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265537 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265542 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265546 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265549 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265553 4795 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265557 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265560 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265564 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265569 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265574 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265578 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265582 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265586 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265590 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265594 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265597 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265600 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265607 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265612 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265617 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265621 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265625 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265629 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265633 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265638 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265642 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265646 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265650 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265654 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265658 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265661 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265665 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265669 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265673 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265678 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265683 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265689 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265693 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265697 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265702 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265706 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265710 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265714 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265718 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265722 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265726 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.265730 4795 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265854 4795 flags.go:64] FLAG: --address="0.0.0.0"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265866 4795 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265877 4795 flags.go:64] FLAG: --anonymous-auth="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265885 4795 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265891 4795 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265896 4795 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265903 4795 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265910 4795 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265915 4795 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265922 4795 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265928 4795 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265935 4795 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265940 4795 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265944 4795 flags.go:64] FLAG: --cgroup-root=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265949 4795 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265954 4795 flags.go:64] FLAG: --client-ca-file=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265959 4795 flags.go:64] FLAG: --cloud-config=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265964 4795 flags.go:64] FLAG: --cloud-provider=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265969 4795 flags.go:64] FLAG: --cluster-dns="[]"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265975 4795 flags.go:64] FLAG: --cluster-domain=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265980 4795 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265985 4795 flags.go:64] FLAG: --config-dir=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265989 4795 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.265995 4795 flags.go:64] FLAG: --container-log-max-files="5"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266002 4795 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266007 4795 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266012 4795 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266017 4795 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266022 4795 flags.go:64] FLAG: --contention-profiling="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266026 4795 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266031 4795 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266036 4795 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266041 4795 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266047 4795 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266051 4795 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266056 4795 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266060 4795 flags.go:64] FLAG: --enable-load-reader="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266082 4795 flags.go:64] FLAG: --enable-server="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266088 4795 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266094 4795 flags.go:64] FLAG: --event-burst="100"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266099 4795 flags.go:64] FLAG: --event-qps="50"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266104 4795 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266108 4795 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266113 4795 flags.go:64] FLAG: --eviction-hard=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266118 4795 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266124 4795 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266129 4795 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266134 4795 flags.go:64] FLAG: --eviction-soft=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266139 4795 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266143 4795 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266147 4795 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266152 4795 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266156 4795 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266161 4795 flags.go:64] FLAG: --fail-swap-on="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266166 4795 flags.go:64] FLAG: --feature-gates=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266171 4795 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266176 4795 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266180 4795 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266185 4795 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266190 4795 flags.go:64] FLAG: --healthz-port="10248"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266195 4795 flags.go:64] FLAG: --help="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266199 4795 flags.go:64] FLAG: --hostname-override=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266204 4795 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266208 4795 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266213 4795 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266218 4795 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266222 4795 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266226 4795 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266230 4795 flags.go:64] FLAG: --image-service-endpoint=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266235 4795 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266239 4795 flags.go:64] FLAG: --kube-api-burst="100"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266244 4795 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266248 4795 flags.go:64] FLAG: --kube-api-qps="50"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266253 4795 flags.go:64] FLAG: --kube-reserved=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266257 4795 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266262 4795 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266266 4795 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266271 4795 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266275 4795 flags.go:64] FLAG: --lock-file=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266279 4795 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266284 4795 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266289 4795 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266296 4795 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266300 4795 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266305 4795 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266309 4795 flags.go:64] FLAG: --logging-format="text"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266314 4795 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266318 4795 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266323 4795 flags.go:64] FLAG: --manifest-url=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266327 4795 flags.go:64] FLAG: --manifest-url-header=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266333 4795 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266338 4795 flags.go:64] FLAG: --max-open-files="1000000"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266343 4795 flags.go:64] FLAG: --max-pods="110"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266347 4795 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266352 4795 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266356 4795 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266361 4795 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266366 4795 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266370 4795 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266376 4795 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266395 4795 flags.go:64] FLAG: --node-status-max-images="50"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266400 4795 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266404 4795 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266409 4795 flags.go:64] FLAG: --pod-cidr=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266414 4795 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266421 4795 flags.go:64] FLAG: --pod-manifest-path=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266426 4795 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266431 4795 flags.go:64] FLAG: --pods-per-core="0"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266435 4795 flags.go:64] FLAG: --port="10250"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266440 4795 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266444 4795 flags.go:64] FLAG: --provider-id=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266449 4795 flags.go:64] FLAG: --qos-reserved=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266453 4795 flags.go:64] FLAG: --read-only-port="10255"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266457 4795 flags.go:64] FLAG: --register-node="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266461 4795 flags.go:64] FLAG: --register-schedulable="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266466 4795 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266474 4795 flags.go:64] FLAG: --registry-burst="10"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266477 4795 flags.go:64] FLAG: --registry-qps="5"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266482 4795 flags.go:64] FLAG: --reserved-cpus=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266487 4795 flags.go:64] FLAG: --reserved-memory=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266493 4795 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266498 4795 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266502 4795 flags.go:64] FLAG: --rotate-certificates="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266507 4795 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266512 4795 flags.go:64] FLAG: --runonce="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266516 4795 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266521 4795 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266526 4795 flags.go:64] FLAG: --seccomp-default="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266530 4795 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266535 4795 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266539 4795 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266544 4795 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266549 4795 flags.go:64] FLAG: --storage-driver-password="root"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266553 4795 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266558 4795 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266562 4795 flags.go:64] FLAG: --storage-driver-user="root"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266567 4795 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266572 4795 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266588 4795 flags.go:64] FLAG: --system-cgroups=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266593 4795 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266600 4795 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266604 4795 flags.go:64] FLAG: --tls-cert-file=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266609 4795 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266615 4795 flags.go:64] FLAG: --tls-min-version=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266620 4795 flags.go:64] FLAG: --tls-private-key-file=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266624 4795 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266628 4795 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266633 4795 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266637 4795 flags.go:64] FLAG: --v="2"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266643 4795 flags.go:64] FLAG: --version="false"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266648 4795 flags.go:64] FLAG: --vmodule=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266654 4795 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.266658 4795 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266759 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266764 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266770 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266774 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266777 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266781 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266785 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266788 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266792 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266795 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266799 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266802 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266806 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266810 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266813 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266817 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266820 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266824 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266827 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266831 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266834 4795 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266838 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266841 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266845 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266848 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266852 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266856 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266860 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266866 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266872 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266876 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266881 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266885 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266890 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266894 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266900 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266906 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266911 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266927 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266932 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266937 4795 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266943 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266947 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266952 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266956 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266960 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266964 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266968 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266972 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266976 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266980 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266983 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266987 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266990 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266994 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.266997 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267001 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267005 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267008 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267011 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267015 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267018 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267022 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267025 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267029 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267032 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267036 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267039 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267043 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267046 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.267051 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.267063 4795 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.277927 4795 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.277956 4795 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278038 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278046 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278054 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278075 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278081 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278087 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278093 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278098 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278104 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278109 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278115 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278120 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278126 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278132 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278138 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278143 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278149 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278155 4795 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278160 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278166 4795 feature_gate.go:330] unrecognized feature gate: Example
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278171 4795
feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278176 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278181 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278186 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278192 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278197 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278202 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278209 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278216 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278224 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278230 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278237 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278243 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278249 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278256 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278261 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278266 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278272 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278277 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278282 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278288 4795 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278293 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278298 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278303 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278309 4795 
feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278314 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278319 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278325 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278330 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278336 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278341 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278346 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278352 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278357 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278363 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278368 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278373 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278379 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278384 4795 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278389 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278394 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278400 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278405 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278412 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278419 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278425 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278430 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278436 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278442 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278449 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278457 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.278465 4795 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278615 4795 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278623 4795 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278630 4795 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278636 4795 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278641 4795 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278646 4795 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278651 4795 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278657 4795 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278662 4795 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 
15:06:07.278668 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278673 4795 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278679 4795 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278684 4795 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278690 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278695 4795 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278701 4795 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278706 4795 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278712 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278717 4795 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278723 4795 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278728 4795 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278733 4795 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278739 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278744 4795 
feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278750 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278755 4795 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278760 4795 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278765 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278771 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278776 4795 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278781 4795 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278786 4795 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278792 4795 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278797 4795 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278803 4795 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278808 4795 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278813 4795 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278818 4795 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 10 15:06:07 crc 
kubenswrapper[4795]: W0310 15:06:07.278823 4795 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278829 4795 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278834 4795 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278840 4795 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278846 4795 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278851 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278858 4795 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278865 4795 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278872 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278877 4795 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278883 4795 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278888 4795 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278894 4795 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278900 4795 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278907 4795 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278913 4795 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278919 4795 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278924 4795 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278930 4795 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278935 4795 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278941 4795 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278946 4795 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278951 4795 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278957 4795 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278963 4795 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278970 4795 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278976 4795 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278982 4795 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278988 4795 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278993 4795 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.278998 4795 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.279003 4795 feature_gate.go:330] unrecognized feature gate: Example Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.279010 4795 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.279019 4795 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.279188 4795 server.go:940] "Client rotation is on, will bootstrap in background" Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.283542 4795 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.286448 4795 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.286533 4795 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.288101 4795 server.go:997] "Starting client certificate rotation" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.288121 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.288247 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.310819 4795 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.312979 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.313980 4795 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.326450 4795 log.go:25] "Validated CRI v1 runtime API" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.366661 4795 log.go:25] "Validated CRI v1 image API" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.368720 4795 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.375561 4795 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-10-15-00-20-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.375591 4795 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.391794 4795 manager.go:217] Machine: {Timestamp:2026-03-10 15:06:07.386435268 +0000 UTC m=+0.552176186 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:99de62c4-4c93-4a3b-bef3-57b8bbfec858 BootID:2b8471d8-f6d6-4351-8edc-ecce171cc356 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:6c:62:bf Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:6c:62:bf Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:9b:c4:a7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ba:39:ed Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:9e:3c:9f Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a6:53:03 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:fe:c4:8e:56:cb:84 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:86:b9:bd:67:15:fc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.392103 4795 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.392275 4795 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.392546 4795 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.392707 4795 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.392747 4795 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.392929 4795 topology_manager.go:138] "Creating topology manager with none policy"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.392938 4795 container_manager_linux.go:303] "Creating device plugin manager"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.393758 4795 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.393785 4795 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.393934 4795 state_mem.go:36] "Initialized new in-memory state store"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.394005 4795 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.398856 4795 kubelet.go:418] "Attempting to sync node with API server"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.398880 4795 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.398905 4795 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.398920 4795 kubelet.go:324] "Adding apiserver pod source"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.398966 4795 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.404298 4795 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.405668 4795 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.407979 4795 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.408895 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused
Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.409006 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError"
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.409047 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused
Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.409105 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.409937 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.409966 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.409975 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.409985 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.410000 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.410009 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.410018 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.410033 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.410045 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.410056 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.410085 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.410112 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.411693 4795 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.412035 4795 server.go:1280] "Started kubelet"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.412245 4795 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.412402 4795 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.413397 4795 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 10 15:06:07 crc systemd[1]: Started Kubernetes Kubelet.
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.414335 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.414380 4795 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.419861 4795 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.419898 4795 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.420271 4795 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.420289 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.421650 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.423969 4795 factory.go:55] Registering systemd factory
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.423993 4795 factory.go:221] Registration of the systemd container factory successfully
Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.424091 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="200ms"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.424904 4795 factory.go:153] Registering CRI-O factory
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.424918 4795 factory.go:221] Registration of the crio container factory successfully
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.424984 4795 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.425005 4795 factory.go:103] Registering Raw factory
Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.424127 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused
Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.425244 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError"
Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.424020 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b833aed3dc0a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,LastTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.426981 4795 manager.go:1196] Started watching for new ooms in manager
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.427712 4795 manager.go:319] Starting recovery of all containers
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.427773 4795 server.go:460] "Adding debug handlers to kubelet server"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434520 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434577 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434599 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434618 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434638 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434655 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434670 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434686 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434706 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434723 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434740 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434759 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434778 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434798 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434815 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.434830 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.435385 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.435997 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.436829 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.436863 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.436884 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.436901 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.436919 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.436935 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.437022 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.437047 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.437089 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.437111 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.437128 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.437206 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.437262 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.437280 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.437301 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.437431 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.437452 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441497 4795 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441547 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441573 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441591 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441609 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441628 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441642 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441656 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441669 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441682 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441695 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441708 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441721 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441736 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441826 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441846 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441865 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441878 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441899 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441916 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441931 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441948 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441962 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441976 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.441991 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442004 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442017 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442030 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442044 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442059 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442095 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442121 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442146 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442165 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442182 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442200 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442217 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442235 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442273 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442288 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442300 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442316 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442333 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442352 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442370 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442392 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442413 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442428 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442440 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442454 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442467 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442479 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442491 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442504 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442517 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442529 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442541 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442556 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442568 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442581 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442593 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442606 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442619 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442631 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442644 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442656 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" 
seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442668 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442681 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442694 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442706 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442729 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442743 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 
15:06:07.442759 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442772 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442785 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442801 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442816 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442830 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442844 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442858 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442872 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442885 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442897 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442910 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442923 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442936 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442948 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442961 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442973 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442987 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.442999 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 10 
15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443011 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443023 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443036 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443050 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443097 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443113 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443126 
4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443138 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443152 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443167 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443178 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443192 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443204 4795 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443217 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443229 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443249 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443262 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443274 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443287 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443298 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443311 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443324 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443336 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443348 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443360 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443372 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443384 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443395 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443410 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443422 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443436 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443454 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443471 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443488 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443505 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443518 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443530 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443542 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443554 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443567 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443580 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443592 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443604 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443616 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443629 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443641 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443652 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443668 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443691 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 10 15:06:07 
crc kubenswrapper[4795]: I0310 15:06:07.443704 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443718 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443730 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443742 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443754 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443767 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 
15:06:07.443779 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443792 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443804 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443816 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443827 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443839 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443853 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443880 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443893 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443906 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443917 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443931 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443943 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443956 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443969 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443983 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.443997 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.444010 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.444023 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" 
seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.444034 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.444047 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.444059 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.445464 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.445515 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.445545 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 
15:06:07.445567 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.445583 4795 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.445596 4795 reconstruct.go:97] "Volume reconstruction finished" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.445605 4795 reconciler.go:26] "Reconciler: start to sync state" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.453000 4795 manager.go:324] Recovery completed Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.467742 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.469159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.469204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.469216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.470160 4795 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.470175 4795 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.470192 4795 state_mem.go:36] "Initialized new in-memory state store" Mar 
10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.472881 4795 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.474817 4795 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.475250 4795 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.475285 4795 kubelet.go:2335] "Starting kubelet main sync loop" Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.475394 4795 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.476164 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.476223 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.490515 4795 policy_none.go:49] "None policy: Start" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.491160 4795 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.491188 4795 state_mem.go:35] "Initializing new in-memory state store" Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.520504 4795 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.543457 4795 manager.go:334] "Starting Device Plugin manager" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.543543 4795 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.543562 4795 server.go:79] "Starting device plugin registration server" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.544228 4795 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.544276 4795 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.545605 4795 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.545734 4795 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.545743 4795 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.552829 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.576035 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.576131 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 
15:06:07.577169 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.577195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.577204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.577367 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.577676 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.577719 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.578273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.578299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.578307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.578408 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.578412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.578427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.578436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.578569 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.578610 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.579120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.579144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.579153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.579263 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.579438 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.579464 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.579584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.579656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.579701 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.580109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.580128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.580136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.580577 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.580595 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.580604 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.580721 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: 
I0310 15:06:07.580924 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.580949 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.581910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.581930 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.581938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.582166 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.582194 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.582203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.582410 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.582444 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.584591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.584623 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.584634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.625320 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="400ms" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.644872 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.645982 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.646034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.646048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.646095 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647285 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647329 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647354 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647372 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647598 
4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647663 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647849 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647893 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647920 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647941 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.647988 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.648016 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.648042 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 10 15:06:07 
crc kubenswrapper[4795]: I0310 15:06:07.748759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.748856 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.748903 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.748952 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749157 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749235 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749233 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749328 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749381 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749442 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749500 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749517 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749546 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749530 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749509 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749583 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749688 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749696 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc 
kubenswrapper[4795]: I0310 15:06:07.749736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749757 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.749947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.847507 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b833aed3dc0a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,LastTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.848493 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.849628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.849666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.849678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.849706 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:06:07 crc kubenswrapper[4795]: E0310 15:06:07.850156 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.914058 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.928839 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.946111 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.962620 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-942589ddc260fff48bb341e0189115c65630424f630432ec94194a3440a600d7 WatchSource:0}: Error finding container 942589ddc260fff48bb341e0189115c65630424f630432ec94194a3440a600d7: Status 404 returned error can't find the container with id 942589ddc260fff48bb341e0189115c65630424f630432ec94194a3440a600d7 Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.962988 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-7b800515f793b60d167df09812fabf8430781185ceae8e69fa9e85827144b51d WatchSource:0}: Error finding container 7b800515f793b60d167df09812fabf8430781185ceae8e69fa9e85827144b51d: Status 404 returned error can't find the container with id 7b800515f793b60d167df09812fabf8430781185ceae8e69fa9e85827144b51d Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.964157 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: I0310 15:06:07.972337 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:06:07 crc kubenswrapper[4795]: W0310 15:06:07.984310 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-bf7cef3b96d2c15467af6cde1db36874b5b30275a88a8f3647d19f2367b82cbc WatchSource:0}: Error finding container bf7cef3b96d2c15467af6cde1db36874b5b30275a88a8f3647d19f2367b82cbc: Status 404 returned error can't find the container with id bf7cef3b96d2c15467af6cde1db36874b5b30275a88a8f3647d19f2367b82cbc Mar 10 15:06:08 crc kubenswrapper[4795]: E0310 15:06:08.026800 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="800ms" Mar 10 15:06:08 crc kubenswrapper[4795]: I0310 15:06:08.250441 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:08 crc kubenswrapper[4795]: W0310 15:06:08.250973 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 10 15:06:08 crc kubenswrapper[4795]: E0310 15:06:08.251034 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:06:08 crc kubenswrapper[4795]: I0310 15:06:08.251694 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 15:06:08 crc kubenswrapper[4795]: I0310 15:06:08.251729 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:08 crc kubenswrapper[4795]: I0310 15:06:08.251744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:08 crc kubenswrapper[4795]: I0310 15:06:08.251769 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:06:08 crc kubenswrapper[4795]: E0310 15:06:08.252125 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 10 15:06:08 crc kubenswrapper[4795]: I0310 15:06:08.422895 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 10 15:06:08 crc kubenswrapper[4795]: I0310 15:06:08.479374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7b800515f793b60d167df09812fabf8430781185ceae8e69fa9e85827144b51d"} Mar 10 15:06:08 crc kubenswrapper[4795]: I0310 15:06:08.480372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6420f9302741b442b2c4c97868361e12d8c802a9d7457b530ebdfceb70e14acc"} Mar 10 15:06:08 crc kubenswrapper[4795]: I0310 15:06:08.481744 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bf7cef3b96d2c15467af6cde1db36874b5b30275a88a8f3647d19f2367b82cbc"} Mar 10 15:06:08 crc kubenswrapper[4795]: I0310 15:06:08.483280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"144dac2e234eeb5ac535841053bc934500319cf266ca3c071dbcbe4a5897947e"} Mar 10 15:06:08 crc kubenswrapper[4795]: I0310 15:06:08.484233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"942589ddc260fff48bb341e0189115c65630424f630432ec94194a3440a600d7"} Mar 10 15:06:08 crc kubenswrapper[4795]: W0310 15:06:08.617780 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 10 15:06:08 crc kubenswrapper[4795]: E0310 15:06:08.617872 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:06:08 crc kubenswrapper[4795]: W0310 15:06:08.668140 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 10 15:06:08 crc kubenswrapper[4795]: E0310 15:06:08.668511 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch 
*v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:06:08 crc kubenswrapper[4795]: E0310 15:06:08.827829 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="1.6s" Mar 10 15:06:09 crc kubenswrapper[4795]: W0310 15:06:09.045807 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 10 15:06:09 crc kubenswrapper[4795]: E0310 15:06:09.045887 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.052202 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.054678 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.054741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.054766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:09 crc 
kubenswrapper[4795]: I0310 15:06:09.054804 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:06:09 crc kubenswrapper[4795]: E0310 15:06:09.055480 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.422387 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.488220 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886" exitCode=0 Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.488296 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886"} Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.488340 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.489249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.489300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.489318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.490305 4795 
generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812" exitCode=0 Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.490358 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812"} Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.490424 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.491576 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.491637 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49" exitCode=0 Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.491692 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49"} Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.491770 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.492016 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.492055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.492089 4795 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.492717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.492734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.492741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.492790 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.492758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.492832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.493302 4795 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3" exitCode=0 Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.493335 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3"} Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.493369 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.493996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.494023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.494036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.496201 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"721fcc3f29c16b316e4076833a39ab9b8ae4021210ed96b59b940498b9940bac"} Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.496228 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"34142cc84eeec1486ccc227175b073882d898b9b6376d5fbb9c019fbe8ef1be3"} Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.496241 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"368c3e594b6c4349320505cc7816a8d2f0547e88886262f47ab540297abffc9b"} Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.496253 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2"} Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.496282 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.496857 4795 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.497482 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.497520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:09 crc kubenswrapper[4795]: I0310 15:06:09.497536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:09 crc kubenswrapper[4795]: E0310 15:06:09.497672 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:06:10 crc kubenswrapper[4795]: W0310 15:06:10.224104 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 10 15:06:10 crc kubenswrapper[4795]: E0310 15:06:10.224187 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.422205 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 10 15:06:10 crc kubenswrapper[4795]: E0310 15:06:10.428664 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="3.2s" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.501046 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10"} Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.501186 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.502223 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.502255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.502268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.503794 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf"} Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.503858 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae"} Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.503871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b"} Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.503881 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.505022 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.505095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.505123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.507264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"817ad8857f1a0762ad3104f345eb347be0a4b26f36178666a6435a32fbc37dcb"} Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.507314 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.507326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0"} Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 
15:06:10.507345 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067"} Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.507363 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07"} Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.507381 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636"} Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.508226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.508255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.508266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.508989 4795 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c" exitCode=0 Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.509085 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.509099 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.509103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c"} Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.509802 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.509830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.509839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.512519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.512625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.512640 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.656547 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.657653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.657690 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.657705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 15:06:10 crc kubenswrapper[4795]: I0310 15:06:10.657731 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:06:10 crc kubenswrapper[4795]: E0310 15:06:10.658284 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.32:6443: connect: connection refused" node="crc" Mar 10 15:06:10 crc kubenswrapper[4795]: W0310 15:06:10.750782 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.32:6443: connect: connection refused Mar 10 15:06:10 crc kubenswrapper[4795]: E0310 15:06:10.750862 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.32:6443: connect: connection refused" logger="UnhandledError" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.102015 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.512842 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.514853 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="817ad8857f1a0762ad3104f345eb347be0a4b26f36178666a6435a32fbc37dcb" exitCode=255 Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.514923 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"817ad8857f1a0762ad3104f345eb347be0a4b26f36178666a6435a32fbc37dcb"} Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.514937 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.515726 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.515757 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.515767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.516248 4795 scope.go:117] "RemoveContainer" containerID="817ad8857f1a0762ad3104f345eb347be0a4b26f36178666a6435a32fbc37dcb" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.518020 4795 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67" exitCode=0 Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.518048 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67"} Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.518115 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.518146 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.518176 4795 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.518361 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.519836 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.519884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.519900 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.519982 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.520044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.520107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.520145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.520187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.520209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.889134 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.889281 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.890905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.890932 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:11 crc kubenswrapper[4795]: I0310 15:06:11.890942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:12 crc kubenswrapper[4795]: I0310 15:06:12.524370 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 15:06:12 crc kubenswrapper[4795]: I0310 15:06:12.527242 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d6fd5b07ed414715dda9fb7e20938a158aaa818d5da9843c2fd87b80bcdc99bf"} Mar 10 15:06:12 crc kubenswrapper[4795]: I0310 15:06:12.527341 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:12 crc kubenswrapper[4795]: I0310 15:06:12.528326 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:12 crc kubenswrapper[4795]: I0310 15:06:12.528376 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:12 crc kubenswrapper[4795]: I0310 15:06:12.528399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 
15:06:12 crc kubenswrapper[4795]: I0310 15:06:12.532743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544"} Mar 10 15:06:12 crc kubenswrapper[4795]: I0310 15:06:12.532804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04"} Mar 10 15:06:12 crc kubenswrapper[4795]: I0310 15:06:12.532831 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816"} Mar 10 15:06:12 crc kubenswrapper[4795]: I0310 15:06:12.532853 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731"} Mar 10 15:06:12 crc kubenswrapper[4795]: I0310 15:06:12.938879 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.209787 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.210043 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.211776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.211837 4795 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.211855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.544293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50"} Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.544456 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.544503 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.544621 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.546208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.546261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.546283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.546440 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.546495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.546519 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.822190 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.859323 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.861252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.861315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.861335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:13 crc kubenswrapper[4795]: I0310 15:06:13.861373 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.061819 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.062445 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.064154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.064217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.064239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:14 crc 
kubenswrapper[4795]: I0310 15:06:14.073130 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.127478 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.127735 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.128903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.128933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.128945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.488131 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.547688 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.547730 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.548381 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.549247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.549381 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.549508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.549450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.549713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.549733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.549420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.549787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:14 crc kubenswrapper[4795]: I0310 15:06:14.549803 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.265640 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.469222 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.551046 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.551464 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 
15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.551465 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.553371 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.553384 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.553424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.553433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.553444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.553451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.553447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.553542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:15 crc kubenswrapper[4795]: I0310 15:06:15.553558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:17 crc kubenswrapper[4795]: I0310 15:06:17.488960 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:06:17 crc kubenswrapper[4795]: I0310 15:06:17.489329 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 10 15:06:17 crc kubenswrapper[4795]: E0310 15:06:17.553271 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:06:19 crc kubenswrapper[4795]: I0310 15:06:19.162870 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 10 15:06:19 crc kubenswrapper[4795]: I0310 15:06:19.163098 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:19 crc kubenswrapper[4795]: I0310 15:06:19.166670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:19 crc kubenswrapper[4795]: I0310 15:06:19.167229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:19 crc kubenswrapper[4795]: I0310 15:06:19.167251 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:21 crc kubenswrapper[4795]: I0310 15:06:21.102971 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 10 15:06:21 crc 
kubenswrapper[4795]: I0310 15:06:21.103671 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 10 15:06:21 crc kubenswrapper[4795]: W0310 15:06:21.285787 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 15:06:21 crc kubenswrapper[4795]: I0310 15:06:21.285867 4795 trace.go:236] Trace[578431138]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 15:06:11.284) (total time: 10001ms): Mar 10 15:06:21 crc kubenswrapper[4795]: Trace[578431138]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (15:06:21.285) Mar 10 15:06:21 crc kubenswrapper[4795]: Trace[578431138]: [10.001123727s] [10.001123727s] END Mar 10 15:06:21 crc kubenswrapper[4795]: E0310 15:06:21.285885 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 15:06:21 crc kubenswrapper[4795]: W0310 15:06:21.290516 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 10 15:06:21 crc kubenswrapper[4795]: I0310 15:06:21.290644 4795 trace.go:236] 
Trace[1241895902]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (10-Mar-2026 15:06:11.289) (total time: 10000ms): Mar 10 15:06:21 crc kubenswrapper[4795]: Trace[1241895902]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (15:06:21.290) Mar 10 15:06:21 crc kubenswrapper[4795]: Trace[1241895902]: [10.000992702s] [10.000992702s] END Mar 10 15:06:21 crc kubenswrapper[4795]: E0310 15:06:21.290678 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 10 15:06:21 crc kubenswrapper[4795]: I0310 15:06:21.423032 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 10 15:06:21 crc kubenswrapper[4795]: E0310 15:06:21.882789 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:21Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b833aed3dc0a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,LastTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC 
m=+0.577752072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:21 crc kubenswrapper[4795]: W0310 15:06:21.884133 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:21Z is after 2026-02-23T05:33:13Z Mar 10 15:06:21 crc kubenswrapper[4795]: E0310 15:06:21.884236 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:21 crc kubenswrapper[4795]: E0310 15:06:21.888699 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:21Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 10 15:06:21 crc kubenswrapper[4795]: W0310 15:06:21.892435 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:21Z is after 2026-02-23T05:33:13Z Mar 10 15:06:21 crc kubenswrapper[4795]: E0310 15:06:21.892508 4795 reflector.go:158] 
"Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:21 crc kubenswrapper[4795]: E0310 15:06:21.900329 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:21 crc kubenswrapper[4795]: E0310 15:06:21.906632 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:21Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 15:06:21 crc kubenswrapper[4795]: I0310 15:06:21.908829 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 15:06:21 crc kubenswrapper[4795]: I0310 15:06:21.908896 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 15:06:21 crc kubenswrapper[4795]: I0310 15:06:21.921286 4795 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 15:06:21 crc kubenswrapper[4795]: I0310 15:06:21.921355 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 10 15:06:22 crc kubenswrapper[4795]: I0310 15:06:22.426634 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:22Z is after 2026-02-23T05:33:13Z Mar 10 15:06:22 crc kubenswrapper[4795]: I0310 15:06:22.570482 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 15:06:22 crc kubenswrapper[4795]: I0310 15:06:22.571363 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 10 15:06:22 crc kubenswrapper[4795]: I0310 15:06:22.574375 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d6fd5b07ed414715dda9fb7e20938a158aaa818d5da9843c2fd87b80bcdc99bf" 
exitCode=255 Mar 10 15:06:22 crc kubenswrapper[4795]: I0310 15:06:22.574449 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d6fd5b07ed414715dda9fb7e20938a158aaa818d5da9843c2fd87b80bcdc99bf"} Mar 10 15:06:22 crc kubenswrapper[4795]: I0310 15:06:22.574540 4795 scope.go:117] "RemoveContainer" containerID="817ad8857f1a0762ad3104f345eb347be0a4b26f36178666a6435a32fbc37dcb" Mar 10 15:06:22 crc kubenswrapper[4795]: I0310 15:06:22.574826 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:22 crc kubenswrapper[4795]: I0310 15:06:22.576566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:22 crc kubenswrapper[4795]: I0310 15:06:22.576639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:22 crc kubenswrapper[4795]: I0310 15:06:22.576664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:22 crc kubenswrapper[4795]: I0310 15:06:22.577538 4795 scope.go:117] "RemoveContainer" containerID="d6fd5b07ed414715dda9fb7e20938a158aaa818d5da9843c2fd87b80bcdc99bf" Mar 10 15:06:22 crc kubenswrapper[4795]: E0310 15:06:22.577884 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:06:23 crc kubenswrapper[4795]: I0310 15:06:23.218681 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:06:23 crc kubenswrapper[4795]: I0310 15:06:23.218830 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:23 crc kubenswrapper[4795]: I0310 15:06:23.219968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:23 crc kubenswrapper[4795]: I0310 15:06:23.220044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:23 crc kubenswrapper[4795]: I0310 15:06:23.220054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:23 crc kubenswrapper[4795]: I0310 15:06:23.425357 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:23Z is after 2026-02-23T05:33:13Z Mar 10 15:06:23 crc kubenswrapper[4795]: I0310 15:06:23.579794 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 10 15:06:24 crc kubenswrapper[4795]: I0310 15:06:24.427287 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:24Z is after 2026-02-23T05:33:13Z Mar 10 15:06:24 crc kubenswrapper[4795]: W0310 15:06:24.760707 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:24Z is after 2026-02-23T05:33:13Z Mar 10 15:06:24 crc kubenswrapper[4795]: E0310 15:06:24.761410 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.274034 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.274327 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.275898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.275960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.275986 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.276950 4795 scope.go:117] "RemoveContainer" containerID="d6fd5b07ed414715dda9fb7e20938a158aaa818d5da9843c2fd87b80bcdc99bf" Mar 10 15:06:25 crc kubenswrapper[4795]: E0310 15:06:25.277326 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with 
CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.284537 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.425508 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:25Z is after 2026-02-23T05:33:13Z Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.587313 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.588778 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.588828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.588842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:25 crc kubenswrapper[4795]: I0310 15:06:25.589488 4795 scope.go:117] "RemoveContainer" containerID="d6fd5b07ed414715dda9fb7e20938a158aaa818d5da9843c2fd87b80bcdc99bf" Mar 10 15:06:25 crc kubenswrapper[4795]: E0310 15:06:25.589682 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:06:26 crc kubenswrapper[4795]: W0310 15:06:26.217138 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:26Z is after 2026-02-23T05:33:13Z Mar 10 15:06:26 crc kubenswrapper[4795]: E0310 15:06:26.217221 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:26 crc kubenswrapper[4795]: I0310 15:06:26.427114 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:26Z is after 2026-02-23T05:33:13Z Mar 10 15:06:27 crc kubenswrapper[4795]: I0310 15:06:27.424878 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:27Z is after 2026-02-23T05:33:13Z Mar 10 15:06:27 crc kubenswrapper[4795]: I0310 
15:06:27.489104 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:06:27 crc kubenswrapper[4795]: I0310 15:06:27.489247 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:06:27 crc kubenswrapper[4795]: E0310 15:06:27.553431 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:06:28 crc kubenswrapper[4795]: E0310 15:06:28.292670 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:28Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 15:06:28.307013 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 15:06:28.308255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 15:06:28.308293 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 
15:06:28.308306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 15:06:28.308328 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:06:28 crc kubenswrapper[4795]: E0310 15:06:28.311253 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:28Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 15:06:28.427584 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:28Z is after 2026-02-23T05:33:13Z Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 15:06:28.863558 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 15:06:28.863758 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 15:06:28.864882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 15:06:28.865002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 15:06:28.865099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:28 crc kubenswrapper[4795]: I0310 15:06:28.865624 4795 scope.go:117] "RemoveContainer" 
containerID="d6fd5b07ed414715dda9fb7e20938a158aaa818d5da9843c2fd87b80bcdc99bf" Mar 10 15:06:28 crc kubenswrapper[4795]: E0310 15:06:28.865903 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:06:29 crc kubenswrapper[4795]: I0310 15:06:29.186903 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 10 15:06:29 crc kubenswrapper[4795]: I0310 15:06:29.187506 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:29 crc kubenswrapper[4795]: I0310 15:06:29.188965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:29 crc kubenswrapper[4795]: I0310 15:06:29.189003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:29 crc kubenswrapper[4795]: I0310 15:06:29.189021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:29 crc kubenswrapper[4795]: I0310 15:06:29.205502 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 10 15:06:29 crc kubenswrapper[4795]: I0310 15:06:29.425391 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:29Z is after 2026-02-23T05:33:13Z Mar 10 15:06:29 crc kubenswrapper[4795]: 
I0310 15:06:29.597179 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:29 crc kubenswrapper[4795]: I0310 15:06:29.598920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:29 crc kubenswrapper[4795]: I0310 15:06:29.598953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:29 crc kubenswrapper[4795]: I0310 15:06:29.598961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:30 crc kubenswrapper[4795]: I0310 15:06:30.044827 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:06:30 crc kubenswrapper[4795]: E0310 15:06:30.050402 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:30 crc kubenswrapper[4795]: W0310 15:06:30.249146 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:30Z is after 2026-02-23T05:33:13Z Mar 10 15:06:30 crc kubenswrapper[4795]: E0310 15:06:30.249247 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:30Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:30 crc kubenswrapper[4795]: I0310 15:06:30.427548 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:30Z is after 2026-02-23T05:33:13Z Mar 10 15:06:31 crc kubenswrapper[4795]: I0310 15:06:31.428502 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:31Z is after 2026-02-23T05:33:13Z Mar 10 15:06:31 crc kubenswrapper[4795]: E0310 15:06:31.889235 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:31Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b833aed3dc0a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,LastTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:06:32 crc kubenswrapper[4795]: I0310 15:06:32.427545 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:32Z is after 2026-02-23T05:33:13Z Mar 10 15:06:32 crc kubenswrapper[4795]: W0310 15:06:32.502738 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:32Z is after 2026-02-23T05:33:13Z Mar 10 15:06:32 crc kubenswrapper[4795]: E0310 15:06:32.502857 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:33 crc kubenswrapper[4795]: I0310 15:06:33.427324 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:33Z is after 2026-02-23T05:33:13Z Mar 10 15:06:34 crc kubenswrapper[4795]: I0310 15:06:34.428969 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:34Z is after 2026-02-23T05:33:13Z Mar 10 15:06:35 crc kubenswrapper[4795]: E0310 15:06:35.299043 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:35Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 10 15:06:35 crc kubenswrapper[4795]: I0310 15:06:35.311568 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:06:35 crc kubenswrapper[4795]: I0310 15:06:35.313411 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:06:35 crc kubenswrapper[4795]: I0310 15:06:35.313482 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:06:35 crc kubenswrapper[4795]: I0310 15:06:35.313503 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:06:35 crc kubenswrapper[4795]: I0310 15:06:35.313563 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:06:35 crc kubenswrapper[4795]: E0310 15:06:35.319329 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:35Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 15:06:35 crc kubenswrapper[4795]: I0310 15:06:35.427479 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:35Z is after 2026-02-23T05:33:13Z Mar 10 15:06:35 crc kubenswrapper[4795]: W0310 15:06:35.918826 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:35Z is after 2026-02-23T05:33:13Z Mar 10 15:06:35 crc kubenswrapper[4795]: E0310 15:06:35.918924 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 10 15:06:36 crc kubenswrapper[4795]: I0310 15:06:36.428006 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:36Z is after 2026-02-23T05:33:13Z Mar 10 15:06:36 crc kubenswrapper[4795]: W0310 15:06:36.497531 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:36Z is after 2026-02-23T05:33:13Z Mar 10 15:06:36 crc 
kubenswrapper[4795]: E0310 15:06:36.497637 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.426257 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:37Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.488829 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.488932 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.489029 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.489303 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.490905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.490960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.490979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.491664 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"368c3e594b6c4349320505cc7816a8d2f0547e88886262f47ab540297abffc9b"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.491944 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://368c3e594b6c4349320505cc7816a8d2f0547e88886262f47ab540297abffc9b" gracePeriod=30
Mar 10 15:06:37 crc kubenswrapper[4795]: E0310 15:06:37.553768 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.625280 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.626009 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="368c3e594b6c4349320505cc7816a8d2f0547e88886262f47ab540297abffc9b" exitCode=255
Mar 10 15:06:37 crc kubenswrapper[4795]: I0310 15:06:37.626118 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"368c3e594b6c4349320505cc7816a8d2f0547e88886262f47ab540297abffc9b"}
Mar 10 15:06:38 crc kubenswrapper[4795]: I0310 15:06:38.427669 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:38Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:38 crc kubenswrapper[4795]: I0310 15:06:38.633022 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 10 15:06:38 crc kubenswrapper[4795]: I0310 15:06:38.633728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1"}
Mar 10 15:06:38 crc kubenswrapper[4795]: I0310 15:06:38.633861 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:38 crc kubenswrapper[4795]: I0310 15:06:38.635331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:38 crc kubenswrapper[4795]: I0310 15:06:38.635383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:38 crc kubenswrapper[4795]: I0310 15:06:38.635401 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:39 crc kubenswrapper[4795]: I0310 15:06:39.426355 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:39Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:39 crc kubenswrapper[4795]: I0310 15:06:39.636300 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:39 crc kubenswrapper[4795]: I0310 15:06:39.637796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:39 crc kubenswrapper[4795]: I0310 15:06:39.637848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:39 crc kubenswrapper[4795]: I0310 15:06:39.637883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:40 crc kubenswrapper[4795]: I0310 15:06:40.428735 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:40Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:41 crc kubenswrapper[4795]: I0310 15:06:41.426357 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:41Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:41 crc kubenswrapper[4795]: I0310 15:06:41.889702 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:06:41 crc kubenswrapper[4795]: I0310 15:06:41.889885 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:41 crc kubenswrapper[4795]: I0310 15:06:41.891670 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:41 crc kubenswrapper[4795]: I0310 15:06:41.891748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:41 crc kubenswrapper[4795]: I0310 15:06:41.891771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:41 crc kubenswrapper[4795]: E0310 15:06:41.896165 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:41Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b833aed3dc0a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,LastTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 15:06:42 crc kubenswrapper[4795]: E0310 15:06:42.304557 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:42Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 10 15:06:42 crc kubenswrapper[4795]: I0310 15:06:42.320056 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:42 crc kubenswrapper[4795]: I0310 15:06:42.321594 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:42 crc kubenswrapper[4795]: I0310 15:06:42.321656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:42 crc kubenswrapper[4795]: I0310 15:06:42.321679 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:42 crc kubenswrapper[4795]: I0310 15:06:42.321719 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:06:42 crc kubenswrapper[4795]: E0310 15:06:42.326567 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:42Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 10 15:06:42 crc kubenswrapper[4795]: I0310 15:06:42.427441 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:42Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:42 crc kubenswrapper[4795]: I0310 15:06:42.476135 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:42 crc kubenswrapper[4795]: I0310 15:06:42.477546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:42 crc kubenswrapper[4795]: I0310 15:06:42.477610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:42 crc kubenswrapper[4795]: I0310 15:06:42.477634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:42 crc kubenswrapper[4795]: I0310 15:06:42.478520 4795 scope.go:117] "RemoveContainer" containerID="d6fd5b07ed414715dda9fb7e20938a158aaa818d5da9843c2fd87b80bcdc99bf"
Mar 10 15:06:43 crc kubenswrapper[4795]: I0310 15:06:43.428168 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:43Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:43 crc kubenswrapper[4795]: I0310 15:06:43.651014 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 10 15:06:43 crc kubenswrapper[4795]: I0310 15:06:43.651736 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 10 15:06:43 crc kubenswrapper[4795]: I0310 15:06:43.653813 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4306cea62ee43b6078e42a55a42bdb5acd924332b4d136024be01504b6ffe9f4" exitCode=255
Mar 10 15:06:43 crc kubenswrapper[4795]: I0310 15:06:43.653872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4306cea62ee43b6078e42a55a42bdb5acd924332b4d136024be01504b6ffe9f4"}
Mar 10 15:06:43 crc kubenswrapper[4795]: I0310 15:06:43.653932 4795 scope.go:117] "RemoveContainer" containerID="d6fd5b07ed414715dda9fb7e20938a158aaa818d5da9843c2fd87b80bcdc99bf"
Mar 10 15:06:43 crc kubenswrapper[4795]: I0310 15:06:43.654218 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:43 crc kubenswrapper[4795]: I0310 15:06:43.656519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:43 crc kubenswrapper[4795]: I0310 15:06:43.656573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:43 crc kubenswrapper[4795]: I0310 15:06:43.656603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:43 crc kubenswrapper[4795]: I0310 15:06:43.657681 4795 scope.go:117] "RemoveContainer" containerID="4306cea62ee43b6078e42a55a42bdb5acd924332b4d136024be01504b6ffe9f4"
Mar 10 15:06:43 crc kubenswrapper[4795]: E0310 15:06:43.658062 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 15:06:44 crc kubenswrapper[4795]: I0310 15:06:44.429425 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:44Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:44 crc kubenswrapper[4795]: I0310 15:06:44.488895 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:06:44 crc kubenswrapper[4795]: I0310 15:06:44.489128 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:44 crc kubenswrapper[4795]: I0310 15:06:44.490467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:44 crc kubenswrapper[4795]: I0310 15:06:44.490499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:44 crc kubenswrapper[4795]: I0310 15:06:44.490512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:44 crc kubenswrapper[4795]: I0310 15:06:44.658874 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 10 15:06:45 crc kubenswrapper[4795]: I0310 15:06:45.427441 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:45Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:46 crc kubenswrapper[4795]: I0310 15:06:46.280728 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 10 15:06:46 crc kubenswrapper[4795]: E0310 15:06:46.286885 4795 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 15:06:46 crc kubenswrapper[4795]: E0310 15:06:46.288052 4795 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 10 15:06:46 crc kubenswrapper[4795]: I0310 15:06:46.426735 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:46Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:47 crc kubenswrapper[4795]: I0310 15:06:47.426208 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:47Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:47 crc kubenswrapper[4795]: I0310 15:06:47.489296 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 15:06:47 crc kubenswrapper[4795]: I0310 15:06:47.489446 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 15:06:47 crc kubenswrapper[4795]: E0310 15:06:47.553934 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 15:06:48 crc kubenswrapper[4795]: I0310 15:06:48.426535 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:48Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:48 crc kubenswrapper[4795]: I0310 15:06:48.863678 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:06:48 crc kubenswrapper[4795]: I0310 15:06:48.863911 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:48 crc kubenswrapper[4795]: I0310 15:06:48.865546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:48 crc kubenswrapper[4795]: I0310 15:06:48.865610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:48 crc kubenswrapper[4795]: I0310 15:06:48.865634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:48 crc kubenswrapper[4795]: I0310 15:06:48.866540 4795 scope.go:117] "RemoveContainer" containerID="4306cea62ee43b6078e42a55a42bdb5acd924332b4d136024be01504b6ffe9f4"
Mar 10 15:06:48 crc kubenswrapper[4795]: E0310 15:06:48.866911 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 15:06:49 crc kubenswrapper[4795]: E0310 15:06:49.310956 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:49Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 10 15:06:49 crc kubenswrapper[4795]: I0310 15:06:49.327665 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:49 crc kubenswrapper[4795]: I0310 15:06:49.330101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:49 crc kubenswrapper[4795]: I0310 15:06:49.330174 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:49 crc kubenswrapper[4795]: I0310 15:06:49.330197 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:49 crc kubenswrapper[4795]: I0310 15:06:49.330242 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:06:49 crc kubenswrapper[4795]: E0310 15:06:49.335847 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:49Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 10 15:06:49 crc kubenswrapper[4795]: I0310 15:06:49.426722 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:49Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:49 crc kubenswrapper[4795]: W0310 15:06:49.577371 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:49Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:49 crc kubenswrapper[4795]: E0310 15:06:49.577807 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 15:06:50 crc kubenswrapper[4795]: I0310 15:06:50.427042 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:50Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:51 crc kubenswrapper[4795]: I0310 15:06:51.102121 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 10 15:06:51 crc kubenswrapper[4795]: I0310 15:06:51.102290 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:51 crc kubenswrapper[4795]: I0310 15:06:51.103231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:51 crc kubenswrapper[4795]: I0310 15:06:51.103294 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:51 crc kubenswrapper[4795]: I0310 15:06:51.103310 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:51 crc kubenswrapper[4795]: I0310 15:06:51.104120 4795 scope.go:117] "RemoveContainer" containerID="4306cea62ee43b6078e42a55a42bdb5acd924332b4d136024be01504b6ffe9f4"
Mar 10 15:06:51 crc kubenswrapper[4795]: E0310 15:06:51.104401 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 10 15:06:51 crc kubenswrapper[4795]: I0310 15:06:51.428915 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:51Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:51 crc kubenswrapper[4795]: E0310 15:06:51.903194 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:51Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b833aed3dc0a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,LastTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 15:06:52 crc kubenswrapper[4795]: I0310 15:06:52.428155 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:52Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:53 crc kubenswrapper[4795]: I0310 15:06:53.427612 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:53Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:54 crc kubenswrapper[4795]: I0310 15:06:54.135486 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 10 15:06:54 crc kubenswrapper[4795]: I0310 15:06:54.135672 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:54 crc kubenswrapper[4795]: I0310 15:06:54.137004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:54 crc kubenswrapper[4795]: I0310 15:06:54.137053 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:54 crc kubenswrapper[4795]: I0310 15:06:54.137100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:54 crc kubenswrapper[4795]: W0310 15:06:54.252973 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:54Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:54 crc kubenswrapper[4795]: E0310 15:06:54.253054 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 15:06:54 crc kubenswrapper[4795]: I0310 15:06:54.426467 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:54Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:55 crc kubenswrapper[4795]: I0310 15:06:55.425733 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:55Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:56 crc kubenswrapper[4795]: E0310 15:06:56.320219 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:56Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 10 15:06:56 crc kubenswrapper[4795]: I0310 15:06:56.336525 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:06:56 crc kubenswrapper[4795]: I0310 15:06:56.338146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:06:56 crc kubenswrapper[4795]: I0310 15:06:56.338403 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:06:56 crc kubenswrapper[4795]: I0310 15:06:56.338598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:06:56 crc kubenswrapper[4795]: I0310 15:06:56.338819 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:06:56 crc kubenswrapper[4795]: E0310 15:06:56.342429 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:56Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 10 15:06:56 crc kubenswrapper[4795]: I0310 15:06:56.427993 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:56Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:56 crc kubenswrapper[4795]: W0310 15:06:56.981893 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:56Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:56 crc kubenswrapper[4795]: E0310 15:06:56.982021 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 15:06:57 crc kubenswrapper[4795]: I0310 15:06:57.428345 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:57Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:57 crc kubenswrapper[4795]: I0310 15:06:57.488965 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 10 15:06:57 crc kubenswrapper[4795]: I0310 15:06:57.489063 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 10 15:06:57 crc kubenswrapper[4795]: E0310 15:06:57.554058 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 10 15:06:58 crc kubenswrapper[4795]: I0310 15:06:58.426644 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:58Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:59 crc kubenswrapper[4795]: I0310 15:06:59.427461 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:59Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:59 crc kubenswrapper[4795]: W0310 15:06:59.750390 4795 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:59Z is after 2026-02-23T05:33:13Z
Mar 10 15:06:59 crc kubenswrapper[4795]: E0310 15:06:59.750641 4795 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:06:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 10 15:07:00 crc kubenswrapper[4795]: I0310 15:07:00.427786 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:00Z is after 2026-02-23T05:33:13Z
Mar 10 15:07:01 crc kubenswrapper[4795]: I0310 15:07:01.428331 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:01Z is after 2026-02-23T05:33:13Z
Mar 10 15:07:01 crc kubenswrapper[4795]: E0310 15:07:01.909349 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:01Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b833aed3dc0a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,LastTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 10 15:07:02 crc kubenswrapper[4795]: I0310 15:07:02.427698 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:02Z is after 2026-02-23T05:33:13Z
Mar 10 15:07:03 crc kubenswrapper[4795]: E0310 15:07:03.326449 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:03Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 10 15:07:03 crc kubenswrapper[4795]: I0310 15:07:03.343494 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 10 15:07:03 crc kubenswrapper[4795]: I0310 15:07:03.345669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 10 15:07:03 crc kubenswrapper[4795]: I0310 15:07:03.345756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 10 15:07:03 crc kubenswrapper[4795]: I0310 15:07:03.345778 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 10 15:07:03 crc kubenswrapper[4795]: I0310 15:07:03.345844 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 10 15:07:03 crc kubenswrapper[4795]: E0310 15:07:03.351831 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet
valid: current time 2026-03-10T15:07:03Z is after 2026-02-23T05:33:13Z" node="crc" Mar 10 15:07:03 crc kubenswrapper[4795]: I0310 15:07:03.426946 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:03Z is after 2026-02-23T05:33:13Z Mar 10 15:07:04 crc kubenswrapper[4795]: I0310 15:07:04.428668 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:05 crc kubenswrapper[4795]: I0310 15:07:05.427902 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.428307 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.476290 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.477499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.477542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.477557 4795 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.478237 4795 scope.go:117] "RemoveContainer" containerID="4306cea62ee43b6078e42a55a42bdb5acd924332b4d136024be01504b6ffe9f4" Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.729141 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.732009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38"} Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.732185 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.733010 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.733040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:06 crc kubenswrapper[4795]: I0310 15:07:06.733051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.429652 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.489225 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.489284 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.489334 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.489461 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.490537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.490611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.490620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.491049 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed 
startup probe, will be restarted" Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.491165 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1" gracePeriod=30 Mar 10 15:07:07 crc kubenswrapper[4795]: E0310 15:07:07.555399 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.737412 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.743664 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.745699 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1" exitCode=255 Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.745757 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1"} Mar 10 15:07:07 crc kubenswrapper[4795]: I0310 15:07:07.745816 4795 scope.go:117] "RemoveContainer" containerID="368c3e594b6c4349320505cc7816a8d2f0547e88886262f47ab540297abffc9b" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.427766 4795 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.754414 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.756413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8e0d5ea865b0805fae58dc7df068aadebac4d089968699012a9979b4144410aa"} Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.756506 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.758024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.758097 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.758111 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.759848 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.760812 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 
15:07:08.763536 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38" exitCode=255 Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.763588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38"} Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.763641 4795 scope.go:117] "RemoveContainer" containerID="4306cea62ee43b6078e42a55a42bdb5acd924332b4d136024be01504b6ffe9f4" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.763773 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.764760 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.764798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.764812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.765519 4795 scope.go:117] "RemoveContainer" containerID="2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38" Mar 10 15:07:08 crc kubenswrapper[4795]: E0310 15:07:08.765757 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:07:08 crc kubenswrapper[4795]: I0310 15:07:08.863625 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:07:09 crc kubenswrapper[4795]: I0310 15:07:09.426897 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:09 crc kubenswrapper[4795]: I0310 15:07:09.769633 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 15:07:09 crc kubenswrapper[4795]: I0310 15:07:09.772476 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:09 crc kubenswrapper[4795]: I0310 15:07:09.773753 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:09 crc kubenswrapper[4795]: I0310 15:07:09.778168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:09 crc kubenswrapper[4795]: I0310 15:07:09.778264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:09 crc kubenswrapper[4795]: I0310 15:07:09.778291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:09 crc kubenswrapper[4795]: I0310 15:07:09.779166 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:09 crc kubenswrapper[4795]: I0310 15:07:09.779240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:09 crc 
kubenswrapper[4795]: I0310 15:07:09.779309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:09 crc kubenswrapper[4795]: I0310 15:07:09.780791 4795 scope.go:117] "RemoveContainer" containerID="2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38" Mar 10 15:07:09 crc kubenswrapper[4795]: E0310 15:07:09.780991 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:07:10 crc kubenswrapper[4795]: E0310 15:07:10.336506 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 15:07:10 crc kubenswrapper[4795]: I0310 15:07:10.352602 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:10 crc kubenswrapper[4795]: I0310 15:07:10.354892 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:10 crc kubenswrapper[4795]: I0310 15:07:10.354950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:10 crc kubenswrapper[4795]: I0310 15:07:10.354968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:10 crc kubenswrapper[4795]: I0310 15:07:10.355004 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:07:10 crc kubenswrapper[4795]: E0310 
15:07:10.360612 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 15:07:10 crc kubenswrapper[4795]: I0310 15:07:10.427881 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.102094 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.102351 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.103877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.104144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.104361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.105401 4795 scope.go:117] "RemoveContainer" containerID="2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.105863 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.428189 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.889599 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.890021 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.891517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.891580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:11 crc kubenswrapper[4795]: I0310 15:07:11.891598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.915722 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833aed3dc0a6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,LastTimestamp:2026-03-10 15:06:07.412011174 +0000 UTC m=+0.577752072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.920852 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a62d98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469186456 +0000 UTC m=+0.634927354,LastTimestamp:2026-03-10 15:06:07.469186456 +0000 UTC m=+0.634927354,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.925304 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6926b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469212267 +0000 UTC m=+0.634953155,LastTimestamp:2026-03-10 15:06:07.469212267 +0000 UTC m=+0.634953155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.931300 4795 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6b94a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469222218 +0000 UTC m=+0.634963116,LastTimestamp:2026-03-10 15:06:07.469222218 +0000 UTC m=+0.634963116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.937362 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af546411a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.546786074 +0000 UTC m=+0.712526972,LastTimestamp:2026-03-10 15:06:07.546786074 +0000 UTC m=+0.712526972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.941710 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a62d98\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a62d98 default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469186456 +0000 UTC m=+0.634927354,LastTimestamp:2026-03-10 15:06:07.577186627 +0000 UTC m=+0.742927515,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.946195 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6926b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6926b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469212267 +0000 UTC m=+0.634953155,LastTimestamp:2026-03-10 15:06:07.577200887 +0000 UTC m=+0.742941785,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.953091 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6b94a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6b94a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469222218 +0000 UTC m=+0.634963116,LastTimestamp:2026-03-10 15:06:07.577209548 +0000 UTC m=+0.742950446,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.956770 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a62d98\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a62d98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469186456 +0000 UTC m=+0.634927354,LastTimestamp:2026-03-10 15:06:07.578289199 +0000 UTC m=+0.744030097,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.960421 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6926b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6926b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469212267 +0000 UTC m=+0.634953155,LastTimestamp:2026-03-10 15:06:07.57830445 +0000 UTC m=+0.744045348,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.963638 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6b94a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6b94a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469222218 +0000 UTC m=+0.634963116,LastTimestamp:2026-03-10 15:06:07.578311811 +0000 UTC m=+0.744052699,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.967272 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a62d98\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a62d98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469186456 +0000 UTC 
m=+0.634927354,LastTimestamp:2026-03-10 15:06:07.578422266 +0000 UTC m=+0.744163164,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.970728 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6926b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6926b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469212267 +0000 UTC m=+0.634953155,LastTimestamp:2026-03-10 15:06:07.578432406 +0000 UTC m=+0.744173304,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.976286 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6b94a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6b94a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469222218 +0000 UTC m=+0.634963116,LastTimestamp:2026-03-10 15:06:07.578440467 +0000 UTC m=+0.744181355,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.978467 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a62d98\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a62d98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469186456 +0000 UTC m=+0.634927354,LastTimestamp:2026-03-10 15:06:07.57913504 +0000 UTC m=+0.744875928,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.980193 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6926b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6926b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469212267 +0000 UTC m=+0.634953155,LastTimestamp:2026-03-10 15:06:07.579150431 +0000 UTC m=+0.744891329,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.981597 4795 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6b94a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6b94a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469222218 +0000 UTC m=+0.634963116,LastTimestamp:2026-03-10 15:06:07.579158141 +0000 UTC m=+0.744899039,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.983449 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a62d98\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a62d98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469186456 +0000 UTC m=+0.634927354,LastTimestamp:2026-03-10 15:06:07.579646584 +0000 UTC m=+0.745387482,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.985778 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6926b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6926b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469212267 +0000 UTC m=+0.634953155,LastTimestamp:2026-03-10 15:06:07.579664295 +0000 UTC m=+0.745405193,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.986906 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6b94a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6b94a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469222218 +0000 UTC m=+0.634963116,LastTimestamp:2026-03-10 15:06:07.579707637 +0000 UTC m=+0.745448535,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.989538 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a62d98\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a62d98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469186456 +0000 UTC m=+0.634927354,LastTimestamp:2026-03-10 15:06:07.580124387 +0000 UTC m=+0.745865285,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.990909 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6926b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6926b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469212267 +0000 UTC m=+0.634953155,LastTimestamp:2026-03-10 15:06:07.580133538 +0000 UTC m=+0.745874436,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.993647 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6b94a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6b94a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469222218 +0000 UTC m=+0.634963116,LastTimestamp:2026-03-10 15:06:07.580141048 +0000 UTC m=+0.745881946,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:11 crc kubenswrapper[4795]: E0310 15:07:11.996791 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a62d98\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a62d98 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469186456 +0000 UTC m=+0.634927354,LastTimestamp:2026-03-10 15:06:07.58059093 +0000 UTC m=+0.746331828,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.000350 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b833af0a6926b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b833af0a6926b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.469212267 +0000 UTC 
m=+0.634953155,LastTimestamp:2026-03-10 15:06:07.58059992 +0000 UTC m=+0.746340818,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.007021 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833b0e9533cd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.971390413 +0000 UTC m=+1.137131361,LastTimestamp:2026-03-10 15:06:07.971390413 +0000 UTC m=+1.137131361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.011338 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b833b0ea1d740 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.972218688 +0000 UTC m=+1.137959616,LastTimestamp:2026-03-10 15:06:07.972218688 +0000 UTC m=+1.137959616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.014281 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b0ee28fb0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.976460208 +0000 UTC m=+1.142201116,LastTimestamp:2026-03-10 15:06:07.976460208 +0000 UTC m=+1.142201116,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.017226 4795 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b0f91eb1d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:07.987952413 +0000 UTC m=+1.153693311,LastTimestamp:2026-03-10 15:06:07.987952413 +0000 UTC m=+1.153693311,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.021057 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b10778d88 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.003001736 +0000 UTC m=+1.168742634,LastTimestamp:2026-03-10 15:06:08.003001736 +0000 UTC m=+1.168742634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.025154 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b2f9dcd16 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.52560207 +0000 UTC m=+1.691342968,LastTimestamp:2026-03-10 15:06:08.52560207 +0000 UTC m=+1.691342968,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.030770 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b2fa7d4d4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.526259412 +0000 UTC m=+1.692000310,LastTimestamp:2026-03-10 15:06:08.526259412 +0000 UTC m=+1.692000310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.036170 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b833b2fb0a5cc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.526837196 +0000 UTC m=+1.692578104,LastTimestamp:2026-03-10 15:06:08.526837196 +0000 UTC m=+1.692578104,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.041804 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833b2fcd63d5 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.528720853 +0000 UTC m=+1.694461751,LastTimestamp:2026-03-10 15:06:08.528720853 +0000 UTC m=+1.694461751,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.047438 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b2fd3cf52 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.529141586 +0000 UTC m=+1.694882484,LastTimestamp:2026-03-10 15:06:08.529141586 +0000 UTC m=+1.694882484,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.051741 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b303a6e5a 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.53586697 +0000 UTC m=+1.701607868,LastTimestamp:2026-03-10 15:06:08.53586697 +0000 UTC m=+1.701607868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.056502 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b3051852a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.537380138 +0000 UTC m=+1.703121036,LastTimestamp:2026-03-10 15:06:08.537380138 +0000 UTC m=+1.703121036,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.061103 4795 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b30569782 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.537712514 +0000 UTC m=+1.703453412,LastTimestamp:2026-03-10 15:06:08.537712514 +0000 UTC m=+1.703453412,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.065849 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b833b3072eca6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.539569318 +0000 UTC m=+1.705310216,LastTimestamp:2026-03-10 15:06:08.539569318 +0000 UTC m=+1.705310216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.071179 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833b309e36c3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.542406339 +0000 UTC m=+1.708147237,LastTimestamp:2026-03-10 15:06:08.542406339 +0000 UTC m=+1.708147237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.075888 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b30a9ff99 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.543178649 +0000 UTC m=+1.708919567,LastTimestamp:2026-03-10 15:06:08.543178649 +0000 UTC m=+1.708919567,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.080521 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b405dee60 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.80662896 +0000 UTC m=+1.972369858,LastTimestamp:2026-03-10 15:06:08.80662896 +0000 UTC m=+1.972369858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.081781 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b40ec92d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 
15:06:08.815977168 +0000 UTC m=+1.981718066,LastTimestamp:2026-03-10 15:06:08.815977168 +0000 UTC m=+1.981718066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.084385 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b40fb0e86 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.816926342 +0000 UTC m=+1.982667240,LastTimestamp:2026-03-10 15:06:08.816926342 +0000 UTC m=+1.982667240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.087487 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b4b3f63a2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.989176738 +0000 UTC m=+2.154917646,LastTimestamp:2026-03-10 15:06:08.989176738 +0000 UTC m=+2.154917646,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.089686 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b4bc9af86 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.998240134 +0000 UTC m=+2.163981032,LastTimestamp:2026-03-10 15:06:08.998240134 +0000 UTC m=+2.163981032,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.092843 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b4bd7e65d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.999171677 +0000 UTC m=+2.164912575,LastTimestamp:2026-03-10 15:06:08.999171677 +0000 UTC m=+2.164912575,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.095674 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b55206e7a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.154920058 +0000 UTC m=+2.320660956,LastTimestamp:2026-03-10 15:06:09.154920058 +0000 UTC 
m=+2.320660956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.098260 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b55b06e8f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.164357263 +0000 UTC m=+2.330098161,LastTimestamp:2026-03-10 15:06:09.164357263 +0000 UTC m=+2.330098161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.100734 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b692da83b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.491331131 +0000 UTC m=+2.657072029,LastTimestamp:2026-03-10 15:06:09.491331131 +0000 UTC m=+2.657072029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.102195 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833b694f984a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.493555274 +0000 UTC m=+2.659296172,LastTimestamp:2026-03-10 15:06:09.493555274 +0000 UTC m=+2.659296172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.111205 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b833b6956d689 openshift-machine-config-operator 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.494029961 +0000 UTC m=+2.659770859,LastTimestamp:2026-03-10 15:06:09.494029961 +0000 UTC m=+2.659770859,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.117431 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b69646051 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.494917201 +0000 UTC m=+2.660658099,LastTimestamp:2026-03-10 15:06:09.494917201 +0000 UTC m=+2.660658099,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 
15:07:12.123021 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b775d157e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.729320318 +0000 UTC m=+2.895061216,LastTimestamp:2026-03-10 15:06:09.729320318 +0000 UTC m=+2.895061216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.126471 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833b77796e4f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.731178063 +0000 UTC m=+2.896918961,LastTimestamp:2026-03-10 15:06:09.731178063 +0000 UTC m=+2.896918961,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc 
kubenswrapper[4795]: E0310 15:07:12.129541 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b77886905 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.732159749 +0000 UTC m=+2.897900647,LastTimestamp:2026-03-10 15:06:09.732159749 +0000 UTC m=+2.897900647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.132544 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b833b77c562e9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.736155881 +0000 UTC m=+2.901896779,LastTimestamp:2026-03-10 15:06:09.736155881 +0000 UTC m=+2.901896779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.135763 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b77dd7e90 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.737735824 +0000 UTC m=+2.903476722,LastTimestamp:2026-03-10 15:06:09.737735824 +0000 UTC m=+2.903476722,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.139009 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b77edf6ca openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.738815178 +0000 UTC m=+2.904556076,LastTimestamp:2026-03-10 15:06:09.738815178 +0000 UTC m=+2.904556076,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.143037 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b79623ca0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.763212448 +0000 UTC m=+2.928953346,LastTimestamp:2026-03-10 15:06:09.763212448 +0000 UTC m=+2.928953346,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.147090 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833b796e04b8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container 
etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.763984568 +0000 UTC m=+2.929725476,LastTimestamp:2026-03-10 15:06:09.763984568 +0000 UTC m=+2.929725476,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.150589 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b79716a02 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.764207106 +0000 UTC m=+2.929948004,LastTimestamp:2026-03-10 15:06:09.764207106 +0000 UTC m=+2.929948004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.153963 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b833b7999df2e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.766858542 +0000 UTC m=+2.932599430,LastTimestamp:2026-03-10 15:06:09.766858542 +0000 UTC m=+2.932599430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.157677 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b827afd8d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.915829645 +0000 UTC m=+3.081570543,LastTimestamp:2026-03-10 15:06:09.915829645 +0000 UTC m=+3.081570543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.161002 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b8290733d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.917236029 +0000 UTC m=+3.082976927,LastTimestamp:2026-03-10 15:06:09.917236029 +0000 UTC m=+3.082976927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.164012 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b83503110 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.929802 +0000 UTC m=+3.095542898,LastTimestamp:2026-03-10 15:06:09.929802 +0000 UTC m=+3.095542898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.167102 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b83503124 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.92980202 +0000 UTC m=+3.095542918,LastTimestamp:2026-03-10 15:06:09.92980202 +0000 UTC m=+3.095542918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.172035 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b835b22e8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.930519272 +0000 UTC m=+3.096260170,LastTimestamp:2026-03-10 15:06:09.930519272 +0000 UTC m=+3.096260170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.177149 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b8360dbcc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:09.930894284 +0000 UTC m=+3.096635182,LastTimestamp:2026-03-10 15:06:09.930894284 +0000 UTC m=+3.096635182,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.180323 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b8cb08041 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container 
kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.087108673 +0000 UTC m=+3.252849571,LastTimestamp:2026-03-10 15:06:10.087108673 +0000 UTC m=+3.252849571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.185769 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b8ccecb7f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.089094015 +0000 UTC m=+3.254834913,LastTimestamp:2026-03-10 15:06:10.089094015 +0000 UTC m=+3.254834913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.189273 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b8d989c8a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.102320266 +0000 UTC m=+3.268061154,LastTimestamp:2026-03-10 15:06:10.102320266 +0000 UTC m=+3.268061154,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.192495 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b8da92d2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.10340587 +0000 UTC m=+3.269146768,LastTimestamp:2026-03-10 15:06:10.10340587 +0000 UTC m=+3.269146768,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.197632 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b833b8db3c3c5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.104099781 +0000 UTC m=+3.269840679,LastTimestamp:2026-03-10 15:06:10.104099781 +0000 UTC m=+3.269840679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.200789 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b9687f8f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.252224759 +0000 UTC m=+3.417965657,LastTimestamp:2026-03-10 15:06:10.252224759 +0000 UTC m=+3.417965657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc 
kubenswrapper[4795]: E0310 15:07:12.204442 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b97c6a015 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.273107989 +0000 UTC m=+3.438848887,LastTimestamp:2026-03-10 15:06:10.273107989 +0000 UTC m=+3.438848887,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.209475 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b97d587ab openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.274084779 +0000 UTC 
m=+3.439825677,LastTimestamp:2026-03-10 15:06:10.274084779 +0000 UTC m=+3.439825677,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.212505 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833ba15551c9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.433454537 +0000 UTC m=+3.599195435,LastTimestamp:2026-03-10 15:06:10.433454537 +0000 UTC m=+3.599195435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.215385 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833ba21c1368 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container 
kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.446480232 +0000 UTC m=+3.612221130,LastTimestamp:2026-03-10 15:06:10.446480232 +0000 UTC m=+3.612221130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.219241 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833ba62b6cd7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.514595031 +0000 UTC m=+3.680335929,LastTimestamp:2026-03-10 15:06:10.514595031 +0000 UTC m=+3.680335929,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.222282 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833bb45ef752 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.752853842 +0000 UTC m=+3.918594750,LastTimestamp:2026-03-10 15:06:10.752853842 +0000 UTC m=+3.918594750,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.225032 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833bb55c4174 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.769453428 +0000 UTC m=+3.935194326,LastTimestamp:2026-03-10 15:06:10.769453428 +0000 UTC m=+3.935194326,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.231031 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b833b97d587ab\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833b97d587ab openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.274084779 +0000 UTC m=+3.439825677,LastTimestamp:2026-03-10 15:06:11.51789714 +0000 UTC m=+4.683638038,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.235482 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833be25e3ee7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:11.524558567 +0000 UTC m=+4.690299505,LastTimestamp:2026-03-10 15:06:11.524558567 +0000 UTC m=+4.690299505,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.238665 4795 event.go:359] "Server rejected event (will not 
retry!)" err="events \"kube-apiserver-crc.189b833ba15551c9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833ba15551c9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.433454537 +0000 UTC m=+3.599195435,LastTimestamp:2026-03-10 15:06:11.678024671 +0000 UTC m=+4.843765569,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.243999 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833bebc45408 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:11.682243592 +0000 UTC m=+4.847984490,LastTimestamp:2026-03-10 15:06:11.682243592 +0000 UTC m=+4.847984490,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.247454 4795 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b833ba21c1368\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833ba21c1368 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:10.446480232 +0000 UTC m=+3.612221130,LastTimestamp:2026-03-10 15:06:11.685282986 +0000 UTC m=+4.851023924,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.252832 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833bec5856c8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:11.691943624 +0000 UTC m=+4.857684522,LastTimestamp:2026-03-10 15:06:11.691943624 +0000 UTC m=+4.857684522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc 
kubenswrapper[4795]: E0310 15:07:12.256152 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833bec635a7a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:11.692665466 +0000 UTC m=+4.858406364,LastTimestamp:2026-03-10 15:06:11.692665466 +0000 UTC m=+4.858406364,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.259799 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833bf6200aba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:11.856026298 +0000 UTC m=+5.021767196,LastTimestamp:2026-03-10 15:06:11.856026298 +0000 UTC m=+5.021767196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.263476 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833bf6e6e6cb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:11.869058763 +0000 UTC m=+5.034799661,LastTimestamp:2026-03-10 15:06:11.869058763 +0000 UTC m=+5.034799661,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.266505 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833bf6f8e7ec openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:11.8702387 +0000 UTC m=+5.035979598,LastTimestamp:2026-03-10 15:06:11.8702387 +0000 UTC 
m=+5.035979598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.269933 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833c01d3ecba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:12.052364474 +0000 UTC m=+5.218105372,LastTimestamp:2026-03-10 15:06:12.052364474 +0000 UTC m=+5.218105372,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.273399 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833c028ec1b7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:12.064608695 +0000 UTC m=+5.230349593,LastTimestamp:2026-03-10 15:06:12.064608695 +0000 UTC m=+5.230349593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.278338 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833c029efb52 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:12.065672018 +0000 UTC m=+5.231412916,LastTimestamp:2026-03-10 15:06:12.065672018 +0000 UTC m=+5.231412916,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.281864 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833c10226338 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:12.29238764 +0000 UTC m=+5.458128548,LastTimestamp:2026-03-10 15:06:12.29238764 +0000 UTC 
m=+5.458128548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.285481 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833c111e6420 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:12.308902944 +0000 UTC m=+5.474643842,LastTimestamp:2026-03-10 15:06:12.308902944 +0000 UTC m=+5.474643842,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.289623 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833c112eedf8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:12.309986808 +0000 UTC 
m=+5.475727716,LastTimestamp:2026-03-10 15:06:12.309986808 +0000 UTC m=+5.475727716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.291264 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833c1eff7a9b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:12.541758107 +0000 UTC m=+5.707499045,LastTimestamp:2026-03-10 15:06:12.541758107 +0000 UTC m=+5.707499045,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.294578 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b833c1f974065 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:12.551704677 +0000 UTC m=+5.717445595,LastTimestamp:2026-03-10 15:06:12.551704677 +0000 UTC 
m=+5.717445595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.299680 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 15:07:12 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189b833d45e4c754 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 10 15:07:12 crc kubenswrapper[4795]: body: Mar 10 15:07:12 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:17.489286996 +0000 UTC m=+10.655027944,LastTimestamp:2026-03-10 15:06:17.489286996 +0000 UTC m=+10.655027944,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:07:12 crc kubenswrapper[4795]: > Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.303747 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833d45e63ee3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:17.489383139 +0000 UTC m=+10.655124077,LastTimestamp:2026-03-10 15:06:17.489383139 +0000 UTC m=+10.655124077,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.307268 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 15:07:12 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-apiserver-crc.189b833e1d53650f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 10 15:07:12 crc kubenswrapper[4795]: body: Mar 10 15:07:12 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:21.103637775 +0000 UTC m=+14.269378713,LastTimestamp:2026-03-10 15:06:21.103637775 +0000 UTC m=+14.269378713,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:07:12 crc kubenswrapper[4795]: > Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.310458 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833e1d549d0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:21.103717647 +0000 UTC m=+14.269458555,LastTimestamp:2026-03-10 15:06:21.103717647 +0000 UTC m=+14.269458555,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.313649 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 10 15:07:12 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-apiserver-crc.189b833e4d526033 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with 
statuscode: 403 Mar 10 15:07:12 crc kubenswrapper[4795]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 10 15:07:12 crc kubenswrapper[4795]: Mar 10 15:07:12 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:21.908877363 +0000 UTC m=+15.074618271,LastTimestamp:2026-03-10 15:06:21.908877363 +0000 UTC m=+15.074618271,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:07:12 crc kubenswrapper[4795]: > Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.316576 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b833e4d53154c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:21.908923724 +0000 UTC m=+15.074664632,LastTimestamp:2026-03-10 15:06:21.908923724 +0000 UTC m=+15.074664632,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.321112 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event=< Mar 10 15:07:12 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189b833f99efacb6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 15:07:12 crc kubenswrapper[4795]: body: Mar 10 15:07:12 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:27.489221814 +0000 UTC m=+20.654962752,LastTimestamp:2026-03-10 15:06:27.489221814 +0000 UTC m=+20.654962752,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:07:12 crc kubenswrapper[4795]: > Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.325649 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833f99f0bab8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:27.489290936 +0000 UTC m=+20.655031864,LastTimestamp:2026-03-10 15:06:27.489290936 +0000 UTC m=+20.655031864,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.330982 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b833f99efacb6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 15:07:12 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189b833f99efacb6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 15:07:12 crc kubenswrapper[4795]: body: Mar 10 15:07:12 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:27.489221814 +0000 UTC m=+20.654962752,LastTimestamp:2026-03-10 15:06:37.488905196 +0000 UTC m=+30.654646134,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:07:12 crc kubenswrapper[4795]: > Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.334986 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b833f99f0bab8\" is forbidden: 
User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833f99f0bab8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:27.489290936 +0000 UTC m=+20.655031864,LastTimestamp:2026-03-10 15:06:37.488982218 +0000 UTC m=+30.654723156,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.339028 4795 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b8341ee24aa42 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:37.49191533 +0000 UTC m=+30.657656308,LastTimestamp:2026-03-10 
15:06:37.49191533 +0000 UTC m=+30.657656308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.342882 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b833b3051852a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b3051852a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.537380138 +0000 UTC m=+1.703121036,LastTimestamp:2026-03-10 15:06:37.616616049 +0000 UTC m=+30.782356977,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.347606 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b833b405dee60\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b405dee60 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.80662896 +0000 UTC m=+1.972369858,LastTimestamp:2026-03-10 15:06:37.846868421 +0000 UTC m=+31.012609319,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.354577 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b833b40ec92d0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833b40ec92d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:08.815977168 +0000 UTC m=+1.981718066,LastTimestamp:2026-03-10 15:06:37.858578086 +0000 UTC m=+31.024319014,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.360052 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b833f99efacb6\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 15:07:12 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189b833f99efacb6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 15:07:12 crc kubenswrapper[4795]: body: Mar 10 15:07:12 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:27.489221814 +0000 UTC m=+20.654962752,LastTimestamp:2026-03-10 15:06:47.489396224 +0000 UTC m=+40.655137162,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:07:12 crc kubenswrapper[4795]: > Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.363772 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b833f99f0bab8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b833f99f0bab8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:27.489290936 +0000 UTC m=+20.655031864,LastTimestamp:2026-03-10 15:06:47.489531638 +0000 UTC m=+40.655272576,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:07:12 crc kubenswrapper[4795]: E0310 15:07:12.368375 4795 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b833f99efacb6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 10 15:07:12 crc kubenswrapper[4795]: &Event{ObjectMeta:{kube-controller-manager-crc.189b833f99efacb6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 10 15:07:12 crc kubenswrapper[4795]: body: Mar 10 15:07:12 crc kubenswrapper[4795]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:06:27.489221814 +0000 UTC m=+20.654962752,LastTimestamp:2026-03-10 15:06:57.489036625 +0000 UTC m=+50.654777563,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 10 15:07:12 crc kubenswrapper[4795]: > Mar 10 15:07:12 crc kubenswrapper[4795]: I0310 15:07:12.428739 4795 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:13 crc kubenswrapper[4795]: I0310 15:07:13.429689 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:14 crc kubenswrapper[4795]: I0310 15:07:14.430817 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:14 crc kubenswrapper[4795]: I0310 15:07:14.489106 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:07:14 crc kubenswrapper[4795]: I0310 15:07:14.489249 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:14 crc kubenswrapper[4795]: I0310 15:07:14.490336 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4795]: I0310 15:07:14.490387 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4795]: I0310 15:07:14.490404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:14 crc kubenswrapper[4795]: I0310 15:07:14.493857 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:07:14 crc kubenswrapper[4795]: I0310 15:07:14.785004 
4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:14 crc kubenswrapper[4795]: I0310 15:07:14.786248 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:14 crc kubenswrapper[4795]: I0310 15:07:14.786366 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:14 crc kubenswrapper[4795]: I0310 15:07:14.786432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:15 crc kubenswrapper[4795]: I0310 15:07:15.427442 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:16 crc kubenswrapper[4795]: I0310 15:07:16.426954 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:17 crc kubenswrapper[4795]: E0310 15:07:17.341891 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 10 15:07:17 crc kubenswrapper[4795]: I0310 15:07:17.361241 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:17 crc kubenswrapper[4795]: I0310 15:07:17.362583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:17 crc kubenswrapper[4795]: I0310 15:07:17.362761 4795 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:17 crc kubenswrapper[4795]: I0310 15:07:17.362888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:17 crc kubenswrapper[4795]: I0310 15:07:17.363049 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:07:17 crc kubenswrapper[4795]: E0310 15:07:17.367994 4795 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 10 15:07:17 crc kubenswrapper[4795]: I0310 15:07:17.427311 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:17 crc kubenswrapper[4795]: E0310 15:07:17.556253 4795 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 10 15:07:18 crc kubenswrapper[4795]: I0310 15:07:18.290328 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 10 15:07:18 crc kubenswrapper[4795]: I0310 15:07:18.303147 4795 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 15:07:18 crc kubenswrapper[4795]: I0310 15:07:18.471259 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:19 crc kubenswrapper[4795]: I0310 15:07:19.427726 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: 
User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:20 crc kubenswrapper[4795]: I0310 15:07:20.426699 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:21 crc kubenswrapper[4795]: I0310 15:07:21.427777 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:21 crc kubenswrapper[4795]: I0310 15:07:21.896772 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:07:21 crc kubenswrapper[4795]: I0310 15:07:21.897014 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:21 crc kubenswrapper[4795]: I0310 15:07:21.898742 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:21 crc kubenswrapper[4795]: I0310 15:07:21.898926 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:21 crc kubenswrapper[4795]: I0310 15:07:21.899060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:22 crc kubenswrapper[4795]: I0310 15:07:22.427169 4795 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 10 15:07:23 crc kubenswrapper[4795]: I0310 15:07:23.278148 4795 csr.go:261] certificate 
signing request csr-dbqxz is approved, waiting to be issued Mar 10 15:07:23 crc kubenswrapper[4795]: I0310 15:07:23.287331 4795 csr.go:257] certificate signing request csr-dbqxz is issued Mar 10 15:07:23 crc kubenswrapper[4795]: I0310 15:07:23.375760 4795 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.287368 4795 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.288377 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-05 21:38:30.723182257 +0000 UTC Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.288508 4795 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7230h31m6.434680626s for next certificate rotation Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.368718 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.370508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.370732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.370984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.371351 4795 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.381380 4795 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.381867 4795 
kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.382036 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.386414 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.386478 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.386501 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.386529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.386554 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.401255 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.412464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.412526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.412545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.412571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.412589 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.428697 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.437949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.438022 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.438048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.438123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.438151 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.452771 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.459952 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.459995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.460005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.460022 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:24 crc kubenswrapper[4795]: I0310 15:07:24.460033 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:24Z","lastTransitionTime":"2026-03-10T15:07:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.471335 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.471508 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.471540 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.571591 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.672193 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.773312 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.873895 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:24 crc kubenswrapper[4795]: E0310 15:07:24.975385 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:25 crc kubenswrapper[4795]: E0310 15:07:25.076584 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:25 crc kubenswrapper[4795]: E0310 15:07:25.177983 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:25 crc kubenswrapper[4795]: E0310 15:07:25.279533 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:25 crc kubenswrapper[4795]: E0310 15:07:25.380132 4795 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:25 crc kubenswrapper[4795]: I0310 15:07:25.475651 4795 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 10 15:07:25 crc kubenswrapper[4795]: I0310 15:07:25.477382 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:25 crc kubenswrapper[4795]: I0310 15:07:25.477432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:25 crc kubenswrapper[4795]: I0310 15:07:25.477460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:25 crc kubenswrapper[4795]: I0310 15:07:25.478412 4795 scope.go:117] "RemoveContainer" containerID="2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38" Mar 10 15:07:25 crc kubenswrapper[4795]: E0310 15:07:25.478691 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:07:25 crc kubenswrapper[4795]: E0310 15:07:25.482169 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:25 crc kubenswrapper[4795]: E0310 15:07:25.583086 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:25 crc kubenswrapper[4795]: E0310 15:07:25.684402 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:25 crc kubenswrapper[4795]: E0310 
15:07:25.786167 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:25 crc kubenswrapper[4795]: E0310 15:07:25.886893 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:26 crc kubenswrapper[4795]: E0310 15:07:26.005051 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:26 crc kubenswrapper[4795]: E0310 15:07:26.105883 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:26 crc kubenswrapper[4795]: E0310 15:07:26.206626 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:26 crc kubenswrapper[4795]: E0310 15:07:26.307464 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:26 crc kubenswrapper[4795]: E0310 15:07:26.409158 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:26 crc kubenswrapper[4795]: E0310 15:07:26.509720 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:26 crc kubenswrapper[4795]: E0310 15:07:26.609943 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:26 crc kubenswrapper[4795]: E0310 15:07:26.710977 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:26 crc kubenswrapper[4795]: E0310 15:07:26.811790 4795 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 10 15:07:26 crc kubenswrapper[4795]: I0310 15:07:26.897390 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 
15:07:26 crc kubenswrapper[4795]: I0310 15:07:26.915139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:26 crc kubenswrapper[4795]: I0310 15:07:26.915200 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:26 crc kubenswrapper[4795]: I0310 15:07:26.915223 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:26 crc kubenswrapper[4795]: I0310 15:07:26.915252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:26 crc kubenswrapper[4795]: I0310 15:07:26.915276 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:26Z","lastTransitionTime":"2026-03-10T15:07:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.018625 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.018664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.018673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.018686 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.018694 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.122135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.122195 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.122216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.122244 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.122262 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.225882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.225950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.225968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.225995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.226016 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.328839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.328911 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.328936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.328967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.328991 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.432480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.432551 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.432571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.432598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.432619 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.470916 4795 apiserver.go:52] "Watching apiserver" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.477369 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.477903 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.478505 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.478599 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.478709 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.478991 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.479131 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.479170 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.479428 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.479663 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.480142 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.483592 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.483924 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.484413 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.484695 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.485012 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.485342 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.485648 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.488545 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.488585 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.521209 4795 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 10 15:07:27 crc kubenswrapper[4795]: 
I0310 15:07:27.525690 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.536680 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.536747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.536770 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.536800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.536826 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.542427 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.566933 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.583491 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.595567 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.606181 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.615442 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.615905 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.616235 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.616378 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.616560 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.616712 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.616870 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.616994 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.617151 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.617591 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.617734 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.617870 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618178 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618325 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.617399 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.617821 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618563 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618175 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618470 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618684 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618690 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod 
"6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618708 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618730 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618783 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618804 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 15:07:27 crc kubenswrapper[4795]: 
I0310 15:07:27.618824 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618881 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618908 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618934 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618933 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618962 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619039 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619188 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619227 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619260 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619229 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619300 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619296 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619657 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619697 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619730 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619797 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619826 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619857 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619887 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 
15:07:27.619916 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619947 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619979 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620009 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620045 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620104 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620138 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620207 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620241 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620275 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620307 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620337 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620405 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620438 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620561 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620595 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620624 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620653 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620684 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620716 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619366 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619390 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619852 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.616746 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.619884 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620229 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620335 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620505 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620753 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620790 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.620774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621032 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621056 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621057 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621100 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621174 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621211 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621320 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621362 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621400 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621434 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621469 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621506 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621576 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621612 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621647 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621681 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621716 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621751 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621786 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621819 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621854 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621906 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621955 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621998 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.622111 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.622147 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.622202 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.622595 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.623049 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.623598 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.627779 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.630012 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.630517 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.630811 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 10 15:07:27 crc 
kubenswrapper[4795]: I0310 15:07:27.630859 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.630894 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.630928 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631002 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631095 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631136 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631172 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631205 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631242 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 
15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631306 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631404 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631434 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631465 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631497 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631556 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631587 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631615 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631647 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631679 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631714 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631746 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631777 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631808 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" 
(UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631839 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631887 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631921 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631951 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.631983 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632045 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632199 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632238 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632307 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:07:27 crc 
kubenswrapper[4795]: I0310 15:07:27.632338 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632372 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632405 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632440 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632472 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632505 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632535 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632601 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632636 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632670 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 
15:07:27.632714 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632781 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632814 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621433 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632849 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632887 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632921 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632956 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632987 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633021 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633055 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633122 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633157 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633191 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633225 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633260 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633292 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633325 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633358 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 
15:07:27.633509 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633544 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633576 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633647 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633681 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633716 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633749 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633784 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633817 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: 
I0310 15:07:27.633894 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633927 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633960 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633995 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634091 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634128 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634161 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634196 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634232 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634303 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634336 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634371 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634408 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634830 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634865 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634970 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635047 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635145 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635184 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635254 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635290 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 
15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635359 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635460 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635498 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635561 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635584 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635603 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635622 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635640 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635660 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635679 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc 
kubenswrapper[4795]: I0310 15:07:27.635698 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635717 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635736 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635757 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635775 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635791 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635809 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635827 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635845 4795 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635863 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635886 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635903 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635921 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635939 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635960 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621514 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621839 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621858 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621910 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.621949 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.622305 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.622391 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.622464 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.622487 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.622727 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.622750 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.618565 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.622930 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.623397 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.643400 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.623479 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.623642 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.624240 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.624308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.624714 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.625387 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.625478 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.625516 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.625671 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.643607 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.625678 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.625712 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.626183 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.626203 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.626344 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.626705 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.626979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.643650 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644488 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.627425 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.627445 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.627440 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.627655 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644656 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.627704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.627828 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.628523 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.628533 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.628572 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.629276 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.629332 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.629945 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644788 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.630333 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.630371 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.630644 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.630703 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.630718 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.632828 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633184 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.633446 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634006 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.634855 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635090 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635227 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635558 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.635630 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644954 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644977 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.645006 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.645017 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.647949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.636185 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.636309 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.636336 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.636554 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.637169 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.637223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.637231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.637180 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.638385 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.639266 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.639586 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.639602 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.639599 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.639711 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.639846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.640037 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.640361 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.640370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.640406 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.640948 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.649732 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.641153 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.640911 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.641234 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.650244 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.650285 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.650508 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.650634 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.641623 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.641659 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.641951 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.641162 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.642151 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.642511 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.642293 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.642745 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.642930 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.643495 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.627404 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.643732 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644046 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644162 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644246 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.623507 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644446 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644529 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644561 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644587 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.644821 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.645711 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.646238 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.646422 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:07:28.146330689 +0000 UTC m=+81.312071597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.646889 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.646920 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.647301 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.647183 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.647599 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.647648 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.648005 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.648517 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.651208 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:28.151183501 +0000 UTC m=+81.316924459 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.648591 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.648782 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.648932 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.648944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.652010 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.652059 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.652147 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.652382 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:28.152344912 +0000 UTC m=+81.318085850 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.652381 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.652870 4795 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.653191 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.653459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.653485 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.653614 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.653815 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.655612 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.655751 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:07:28.155721494 +0000 UTC m=+81.321462462 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.654552 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.654651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.655249 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.655556 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.655967 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.653903 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.656023 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.656100 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.656207 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.656342 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.656339 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.656664 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.656865 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.657235 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.658457 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.658486 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.664302 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.664467 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.664576 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.664812 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.665287 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.665635 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.665799 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.665820 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.665883 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:28.165860701 +0000 UTC m=+81.331601679 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.665911 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.666202 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.666282 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.666541 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.666892 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.667471 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.668372 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.668504 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.669440 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.669613 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.671225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.672264 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.673443 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.673494 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.673845 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.673865 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.674051 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.674142 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.674366 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.677361 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.681682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.683612 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.683646 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.691423 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.697488 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.701564 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.736867 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.736921 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.736992 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737001 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737007 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737051 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737062 4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737087 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737152 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737165 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737181 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737191 4795 
reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737201 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737210 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737220 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737230 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737238 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737247 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737257 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737265 4795 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737274 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737284 4795 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737295 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737304 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737316 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737326 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737351 4795 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737362 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737371 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737383 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737395 4795 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737416 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737428 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737437 4795 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737447 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737458 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737470 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737481 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737492 4795 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737500 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node 
\"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737509 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737519 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737528 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737536 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737545 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737553 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737562 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737572 4795 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737580 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737589 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737599 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737608 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737619 4795 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737628 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737637 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737646 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737672 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737680 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737690 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737700 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737711 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737720 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath 
\"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737730 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737740 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737748 4795 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737756 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737766 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737776 4795 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737787 4795 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737796 
4795 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737804 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737813 4795 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737821 4795 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737831 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737840 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737850 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737862 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737872 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737880 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737888 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737897 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737905 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737912 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737920 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 10 
15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737928 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737936 4795 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737944 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737952 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737962 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737972 4795 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737982 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.737994 4795 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738003 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738011 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738019 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738028 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738037 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738045 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738055 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738067 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738096 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738104 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738112 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738120 4795 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738128 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738137 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 
10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738144 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738153 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738161 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738169 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738179 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738187 4795 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738195 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738203 4795 reconciler_common.go:293] "Volume 
detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738211 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738219 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738229 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738238 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738246 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738254 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738263 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738271 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738279 4795 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738288 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738296 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738306 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738315 4795 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738323 4795 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") 
on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738331 4795 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738339 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738346 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738354 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738363 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738370 4795 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738381 4795 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738392 4795 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738402 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738413 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738423 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738432 4795 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738444 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738454 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738465 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node 
\"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738475 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738486 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738496 4795 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738507 4795 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738516 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738525 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738533 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738541 4795 reconciler_common.go:293] 
"Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738549 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738558 4795 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738568 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738579 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738589 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738601 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738612 4795 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738622 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738633 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738644 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738657 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738668 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738680 4795 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738689 4795 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738697 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738705 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738714 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738730 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738740 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738749 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738757 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738765 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738773 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.738782 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.748400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.748431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.748440 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.748454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.748463 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.808961 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.828390 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.834769 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:27 crc kubenswrapper[4795]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 15:07:27 crc kubenswrapper[4795]: set -o allexport Mar 10 15:07:27 crc kubenswrapper[4795]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 15:07:27 crc kubenswrapper[4795]: source /etc/kubernetes/apiserver-url.env Mar 10 15:07:27 crc kubenswrapper[4795]: else Mar 10 15:07:27 crc kubenswrapper[4795]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 15:07:27 crc kubenswrapper[4795]: exit 1 Mar 10 15:07:27 crc kubenswrapper[4795]: fi Mar 10 15:07:27 crc kubenswrapper[4795]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 15:07:27 crc kubenswrapper[4795]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:27 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.836042 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.841175 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 10 15:07:27 crc kubenswrapper[4795]: W0310 15:07:27.845794 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-1fdc8b42c70b46272784d63b162a1e32814348ec9b49b7670653788b924f7b13 WatchSource:0}: Error finding container 1fdc8b42c70b46272784d63b162a1e32814348ec9b49b7670653788b924f7b13: Status 404 returned error can't find the container with id 1fdc8b42c70b46272784d63b162a1e32814348ec9b49b7670653788b924f7b13 Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.850639 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:27 crc kubenswrapper[4795]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 15:07:27 crc kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 10 15:07:27 crc kubenswrapper[4795]: set -o allexport Mar 10 15:07:27 crc kubenswrapper[4795]: source "/env/_master" Mar 10 15:07:27 crc kubenswrapper[4795]: set +o allexport Mar 10 15:07:27 crc kubenswrapper[4795]: fi Mar 10 15:07:27 crc kubenswrapper[4795]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 10 15:07:27 crc kubenswrapper[4795]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 15:07:27 crc kubenswrapper[4795]: ho_enable="--enable-hybrid-overlay" Mar 10 15:07:27 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 15:07:27 crc kubenswrapper[4795]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 15:07:27 crc kubenswrapper[4795]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 15:07:27 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 15:07:27 crc kubenswrapper[4795]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 15:07:27 crc kubenswrapper[4795]: --webhook-host=127.0.0.1 \ Mar 10 15:07:27 crc kubenswrapper[4795]: --webhook-port=9743 \ Mar 10 15:07:27 crc kubenswrapper[4795]: ${ho_enable} \ Mar 10 15:07:27 crc kubenswrapper[4795]: --enable-interconnect \ Mar 10 15:07:27 crc kubenswrapper[4795]: --disable-approver \ Mar 10 15:07:27 crc kubenswrapper[4795]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 15:07:27 crc kubenswrapper[4795]: --wait-for-kubernetes-api=200s \ Mar 10 15:07:27 crc kubenswrapper[4795]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 15:07:27 crc kubenswrapper[4795]: --loglevel="${LOGLEVEL}" Mar 10 15:07:27 crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:27 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.852307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.852588 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc 
kubenswrapper[4795]: I0310 15:07:27.852668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.852793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.852863 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.852800 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:27 crc kubenswrapper[4795]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 15:07:27 crc kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 10 15:07:27 crc kubenswrapper[4795]: set -o allexport Mar 10 15:07:27 crc kubenswrapper[4795]: source "/env/_master" Mar 10 15:07:27 crc kubenswrapper[4795]: set +o allexport Mar 10 15:07:27 crc kubenswrapper[4795]: fi Mar 10 15:07:27 crc kubenswrapper[4795]: Mar 10 15:07:27 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 15:07:27 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 15:07:27 crc kubenswrapper[4795]: --disable-webhook \ Mar 10 15:07:27 crc kubenswrapper[4795]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 10 15:07:27 crc kubenswrapper[4795]: 
--loglevel="${LOGLEVEL}" Mar 10 15:07:27 crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:27 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.854358 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.854502 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 15:07:27 crc kubenswrapper[4795]: E0310 15:07:27.856274 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.955767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.955822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.955838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.955861 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:27 crc kubenswrapper[4795]: I0310 15:07:27.955879 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:27Z","lastTransitionTime":"2026-03-10T15:07:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.058777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.058831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.058843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.058864 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.058878 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.161318 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.161728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.161876 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.162020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.162206 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.243869 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.244176 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-10 15:07:29.244139705 +0000 UTC m=+82.409880643 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.244637 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.244871 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.245096 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.245131 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.245559 4795 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.245800 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.245162 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:29.244967918 +0000 UTC m=+82.410708856 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.246317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.246392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.246592 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.246623 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.247001 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:29.246974922 +0000 UTC m=+82.412715860 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.247192 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.247308 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:29.24727519 +0000 UTC m=+82.413016128 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.247499 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.247735 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:29.247716513 +0000 UTC m=+82.413457451 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.265048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.265442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.265542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.265633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.265712 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.368364 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.368416 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.368428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.368445 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.368458 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.470475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.470514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.470525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.470539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.470548 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.573806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.573881 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.573903 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.573928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.573946 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.675962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.676303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.676553 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.676661 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.676950 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.780400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.780460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.780479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.780506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.780525 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.827182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c55c4032fc1ada8c9fbfce05a8cfb2fcc3c26428840589793e3b7ea49a7d04a1"} Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.829513 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.829838 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1fdc8b42c70b46272784d63b162a1e32814348ec9b49b7670653788b924f7b13"} Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.830694 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" 
pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.831802 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:28 crc kubenswrapper[4795]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 15:07:28 crc kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 10 15:07:28 crc kubenswrapper[4795]: set -o allexport Mar 10 15:07:28 crc kubenswrapper[4795]: source "/env/_master" Mar 10 15:07:28 crc kubenswrapper[4795]: set +o allexport Mar 10 15:07:28 crc kubenswrapper[4795]: fi Mar 10 15:07:28 crc kubenswrapper[4795]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 10 15:07:28 crc kubenswrapper[4795]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 10 15:07:28 crc kubenswrapper[4795]: ho_enable="--enable-hybrid-overlay" Mar 10 15:07:28 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 10 15:07:28 crc kubenswrapper[4795]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 10 15:07:28 crc kubenswrapper[4795]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 10 15:07:28 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 15:07:28 crc kubenswrapper[4795]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 10 15:07:28 crc kubenswrapper[4795]: --webhook-host=127.0.0.1 \ Mar 10 15:07:28 crc kubenswrapper[4795]: --webhook-port=9743 \ Mar 10 15:07:28 crc kubenswrapper[4795]: ${ho_enable} \ Mar 10 15:07:28 crc kubenswrapper[4795]: --enable-interconnect \ Mar 10 15:07:28 crc kubenswrapper[4795]: 
--disable-approver \ Mar 10 15:07:28 crc kubenswrapper[4795]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 10 15:07:28 crc kubenswrapper[4795]: --wait-for-kubernetes-api=200s \ Mar 10 15:07:28 crc kubenswrapper[4795]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 10 15:07:28 crc kubenswrapper[4795]: --loglevel="${LOGLEVEL}" Mar 10 15:07:28 crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:n
il,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:28 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.833807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f77bdc7222ace238d5bfe57533ad27aef76398b694d9da0a8db76ccc83a6ab62"} Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.834903 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:28 crc kubenswrapper[4795]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 10 15:07:28 crc kubenswrapper[4795]: if [[ -f "/env/_master" ]]; then Mar 10 15:07:28 crc kubenswrapper[4795]: set -o allexport Mar 10 15:07:28 crc kubenswrapper[4795]: source "/env/_master" Mar 10 15:07:28 crc kubenswrapper[4795]: set +o allexport Mar 10 15:07:28 crc kubenswrapper[4795]: fi Mar 10 15:07:28 crc kubenswrapper[4795]: Mar 10 15:07:28 crc kubenswrapper[4795]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 10 15:07:28 crc kubenswrapper[4795]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 10 15:07:28 crc kubenswrapper[4795]: --disable-webhook \ Mar 10 15:07:28 crc kubenswrapper[4795]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" 
\ Mar 10 15:07:28 crc kubenswrapper[4795]: --loglevel="${LOGLEVEL}" Mar 10 15:07:28 crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:28 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.836314 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with 
CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.836671 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:07:28 crc kubenswrapper[4795]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 10 15:07:28 crc kubenswrapper[4795]: set -o allexport Mar 10 15:07:28 crc kubenswrapper[4795]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 10 15:07:28 crc kubenswrapper[4795]: source /etc/kubernetes/apiserver-url.env Mar 10 15:07:28 crc kubenswrapper[4795]: else Mar 10 15:07:28 crc kubenswrapper[4795]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 10 15:07:28 crc kubenswrapper[4795]: exit 1 Mar 10 15:07:28 crc kubenswrapper[4795]: fi Mar 10 15:07:28 crc kubenswrapper[4795]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 10 15:07:28 crc kubenswrapper[4795]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 10 15:07:28 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 10 15:07:28 crc kubenswrapper[4795]: E0310 15:07:28.837878 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.846977 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.863793 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.879467 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.883433 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.883480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.883499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.883525 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.883543 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.895995 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.910172 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.922055 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.935970 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.949819 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.964557 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.979699 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.986368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.986435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.986451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.986480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.986499 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:28Z","lastTransitionTime":"2026-03-10T15:07:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:28 crc kubenswrapper[4795]: I0310 15:07:28.990015 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.003032 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.089103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.089135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.089146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.089159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.089169 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.192142 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.192226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.192252 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.192282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.192302 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.256147 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.256232 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.256285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.256316 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.256353 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256394 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:07:31.256351651 +0000 UTC m=+84.422092609 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256484 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256540 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:31.256524936 +0000 UTC m=+84.422265854 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256555 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256636 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256678 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:31.256650039 +0000 UTC m=+84.422390967 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256567 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256775 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256802 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256683 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256864 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.256869 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:31.256849695 +0000 UTC m=+84.422590593 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.257182 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:31.257161023 +0000 UTC m=+84.422901931 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.296491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.296533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.296545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.296567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.296579 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.398782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.398845 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.398863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.398886 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.398903 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.475931 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.476044 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.475931 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.476438 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.476565 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:29 crc kubenswrapper[4795]: E0310 15:07:29.476718 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.482516 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.483394 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.485537 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.486535 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.488088 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.489320 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.490123 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.491428 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.492304 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.493438 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.493886 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.494545 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.495026 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.495509 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.495981 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.496491 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.497040 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.497462 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.497974 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.498507 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.498949 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.499479 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.499878 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.500509 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.500958 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.501177 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.501232 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.501250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.501534 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.501737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.501804 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.502136 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.505332 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.505836 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.506598 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.507018 4795 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.507132 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.508976 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.509561 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.510013 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.511403 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.512342 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.512849 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.513765 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.514384 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.515200 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.515767 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.516961 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.517812 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.519404 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.520909 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.522930 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.524892 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.526849 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.528049 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.529419 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.530004 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.530555 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.531357 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.603858 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.604093 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.604227 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.604343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.604450 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.707419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.707465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.707481 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.707503 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.707519 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.811051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.811155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.811258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.811286 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.811307 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.914984 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.915415 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.915604 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.915795 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:29 crc kubenswrapper[4795]: I0310 15:07:29.916001 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:29Z","lastTransitionTime":"2026-03-10T15:07:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.019051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.019381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.019465 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.019558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.019709 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.122591 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.122628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.122637 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.122653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.122662 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.225217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.225260 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.225274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.225293 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.225307 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.327572 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.327620 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.327631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.327649 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.327661 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.429758 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.429807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.429823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.429842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.429857 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.533022 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.533125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.533144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.533185 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.533224 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.636449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.636538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.636567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.636599 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.636622 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.739442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.739495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.739506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.739518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.739527 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.842037 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.842146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.842166 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.842203 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.842237 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.945152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.945190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.945199 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.945215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:30 crc kubenswrapper[4795]: I0310 15:07:30.945224 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:30Z","lastTransitionTime":"2026-03-10T15:07:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.047372 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.047412 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.047424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.047452 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.047465 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.150214 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.150249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.150257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.150271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.150280 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.253098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.253172 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.253196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.253225 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.253248 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.278682 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.278780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.278828 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.278868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.278890 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-03-10 15:07:35.27885905 +0000 UTC m=+88.444599948 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.278951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279017 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279022 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279043 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279061 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279095 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279124 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:35.279110546 +0000 UTC m=+88.444851454 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279150 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:35.279133947 +0000 UTC m=+88.444874855 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279199 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:35.279170748 +0000 UTC m=+88.444911676 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279242 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279312 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279333 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.279453 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:35.279428795 +0000 UTC m=+88.445169723 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.355467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.355713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.355779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.355852 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.355919 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.458764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.458834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.458855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.458883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.458902 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.475450 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.475625 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.475714 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.475778 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.476046 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:31 crc kubenswrapper[4795]: E0310 15:07:31.476131 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.561751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.561815 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.561838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.561866 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.561889 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.664353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.664468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.664496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.664527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.664550 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.767732 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.767788 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.767808 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.767830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.767850 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.870773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.870822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.870841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.870864 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.870882 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.973216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.973254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.973266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.973282 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:31 crc kubenswrapper[4795]: I0310 15:07:31.973294 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:31Z","lastTransitionTime":"2026-03-10T15:07:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.075769 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.075848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.075872 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.075904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.075927 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.179062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.179170 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.179199 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.179235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.179262 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.281249 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.281327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.281352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.281381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.281403 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.384219 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.384280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.384291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.384309 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.384321 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.486927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.486968 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.486981 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.486996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.487006 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.589565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.589600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.589609 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.589622 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.589631 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.692327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.692365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.692377 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.692392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.692401 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.795380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.795511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.795537 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.795566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.795587 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.897523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.897841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.898044 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.898419 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:32 crc kubenswrapper[4795]: I0310 15:07:32.898714 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:32Z","lastTransitionTime":"2026-03-10T15:07:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.002058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.002468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.002697 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.002878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.003065 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.106654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.106738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.106763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.106796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.106819 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.209921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.209974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.209994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.210020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.210040 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.313027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.313132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.313151 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.313173 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.313190 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.416338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.416385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.416395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.416413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.416424 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.475628 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.475695 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:33 crc kubenswrapper[4795]: E0310 15:07:33.475790 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:33 crc kubenswrapper[4795]: E0310 15:07:33.475918 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.475641 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:33 crc kubenswrapper[4795]: E0310 15:07:33.476035 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.519530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.519617 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.519635 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.519656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.519674 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.623413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.623485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.623502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.623529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.623547 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.726543 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.726941 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.727112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.727254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.727372 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.830488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.830545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.830557 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.830574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.830585 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.933103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.933136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.933146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.933167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:33 crc kubenswrapper[4795]: I0310 15:07:33.933180 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:33Z","lastTransitionTime":"2026-03-10T15:07:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.036776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.036834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.036856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.036884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.036907 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.139651 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.139710 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.139728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.139750 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.139770 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.242764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.242833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.242852 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.242878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.242899 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.345867 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.345950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.345976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.346004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.346026 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.448181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.448250 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.448272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.448302 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.448325 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.551919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.551979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.551996 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.552020 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.552037 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.655209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.655291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.655314 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.655344 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.655365 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.759036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.759121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.759144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.759174 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.759196 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.778002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.778095 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.778115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.778143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.778163 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: E0310 15:07:34.795028 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.799434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.799493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.799512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.799536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.799554 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: E0310 15:07:34.815542 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.820193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.820267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.820297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.820328 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.820350 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.840667 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.840750 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.840775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.840802 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.840820 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.865806 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.865874 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.865898 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.865929 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.865953 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: E0310 15:07:34.880960 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:34 crc kubenswrapper[4795]: E0310 15:07:34.881222 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.883116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.883183 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.883207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.883238 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.883264 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.986280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.986330 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.986340 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.986355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:34 crc kubenswrapper[4795]: I0310 15:07:34.986364 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:34Z","lastTransitionTime":"2026-03-10T15:07:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.088745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.088779 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.088789 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.088807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.088819 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.191415 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.191457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.191468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.191485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.191496 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.293569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.293612 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.293624 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.293643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.293655 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.323530 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.323630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.323728 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:07:43.323689762 +0000 UTC m=+96.489430690 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.323749 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.323770 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.323783 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.323830 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:43.323812416 +0000 UTC m=+96.489553314 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.323826 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.323907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.323949 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.323972 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:35 crc 
kubenswrapper[4795]: E0310 15:07:35.324015 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:43.324003981 +0000 UTC m=+96.489744969 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.324046 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.324134 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:43.324119844 +0000 UTC m=+96.489860782 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.324161 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.324200 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.324219 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.324337 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:43.324301949 +0000 UTC m=+96.490042857 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.397059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.397220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.397239 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.397262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.397279 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.476284 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.476351 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.476405 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.476458 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.476558 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:35 crc kubenswrapper[4795]: E0310 15:07:35.476622 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.488896 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.499643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.499696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.499711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.499730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.499746 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.602441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.602484 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.602492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.602506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.602516 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.705105 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.705154 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.705166 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.705185 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.705197 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.807437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.807490 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.807507 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.807529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.807544 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.910889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.910960 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.910979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.911002 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:35 crc kubenswrapper[4795]: I0310 15:07:35.911019 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:35Z","lastTransitionTime":"2026-03-10T15:07:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.013789 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.013848 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.013860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.013879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.013892 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:36Z","lastTransitionTime":"2026-03-10T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.116326 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.116381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.116393 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.116413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.116427 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:36Z","lastTransitionTime":"2026-03-10T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.219422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.219464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.219474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.219488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.219497 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:36Z","lastTransitionTime":"2026-03-10T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.321089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.321160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.321173 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.321185 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.321193 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:36Z","lastTransitionTime":"2026-03-10T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.422545 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.422575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.422583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.422595 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.422604 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:36Z","lastTransitionTime":"2026-03-10T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.525119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.525156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.525167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.525183 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.525195 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:36Z","lastTransitionTime":"2026-03-10T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.628256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.628327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.628351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.628376 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.628399 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:36Z","lastTransitionTime":"2026-03-10T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.730999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.731049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.731062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.731098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.731111 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:36Z","lastTransitionTime":"2026-03-10T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.833749 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.833807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.833821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.833838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.833850 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:36Z","lastTransitionTime":"2026-03-10T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.935738 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.935855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.935878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.935909 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:36 crc kubenswrapper[4795]: I0310 15:07:36.935932 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:36Z","lastTransitionTime":"2026-03-10T15:07:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.038052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.038123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.038140 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.038160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.038174 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.140741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.140782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.140791 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.140807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.140816 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.243063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.243169 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.243201 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.243225 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.243242 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.313939 4795 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.346048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.346109 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.346121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.346139 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.346150 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.449339 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.449418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.449440 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.449471 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.449495 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.476412 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.476494 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:37 crc kubenswrapper[4795]: E0310 15:07:37.476589 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:37 crc kubenswrapper[4795]: E0310 15:07:37.476683 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.476819 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:37 crc kubenswrapper[4795]: E0310 15:07:37.476966 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.494997 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.506691 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.513671 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.522244 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.532623 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.541151 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.551743 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.551782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.551825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.551841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.551859 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.551874 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.653426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.653737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.653747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.653763 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.653772 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.756493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.756542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.756553 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.756575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.756586 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.859838 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.859912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.859935 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.859963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.859988 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.962273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.962351 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.962367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.962390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:37 crc kubenswrapper[4795]: I0310 15:07:37.962444 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:37Z","lastTransitionTime":"2026-03-10T15:07:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.065003 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.065063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.065336 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.065368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.065390 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:38Z","lastTransitionTime":"2026-03-10T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.167850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.167915 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.167936 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.167963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.167986 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:38Z","lastTransitionTime":"2026-03-10T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.271432 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.271497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.271515 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.271540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.271556 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:38Z","lastTransitionTime":"2026-03-10T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.374327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.374385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.374407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.374438 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.374456 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:38Z","lastTransitionTime":"2026-03-10T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.476727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.476789 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.476807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.476832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.476851 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:38Z","lastTransitionTime":"2026-03-10T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.579759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.579792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.579802 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.579814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.579823 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:38Z","lastTransitionTime":"2026-03-10T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.682558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.682602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.682613 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.682629 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.682640 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:38Z","lastTransitionTime":"2026-03-10T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.709598 4795 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.784483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.784538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.784553 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.784573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.784588 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:38Z","lastTransitionTime":"2026-03-10T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.886540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.886586 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.886600 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.886616 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.886626 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:38Z","lastTransitionTime":"2026-03-10T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.988972 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.989038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.989057 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.989113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:38 crc kubenswrapper[4795]: I0310 15:07:38.989136 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:38Z","lastTransitionTime":"2026-03-10T15:07:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.091208 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.091272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.091287 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.091310 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.091326 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:39Z","lastTransitionTime":"2026-03-10T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.193724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.193770 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.193786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.193809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.193827 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:39Z","lastTransitionTime":"2026-03-10T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.295422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.295508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.295529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.295555 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.295574 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:39Z","lastTransitionTime":"2026-03-10T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.398454 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.398495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.398504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.398538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.398548 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:39Z","lastTransitionTime":"2026-03-10T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.476499 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:39 crc kubenswrapper[4795]: E0310 15:07:39.476729 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.476753 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.477945 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:39 crc kubenswrapper[4795]: E0310 15:07:39.481970 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:39 crc kubenswrapper[4795]: E0310 15:07:39.482229 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.491482 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.491879 4795 scope.go:117] "RemoveContainer" containerID="2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38" Mar 10 15:07:39 crc kubenswrapper[4795]: E0310 15:07:39.492028 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.500470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.500514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.500527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.500544 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.500558 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:39Z","lastTransitionTime":"2026-03-10T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.603348 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.603421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.603443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.603475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.603497 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:39Z","lastTransitionTime":"2026-03-10T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.706256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.706361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.706385 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.706409 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.706427 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:39Z","lastTransitionTime":"2026-03-10T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.808641 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.808676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.808687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.808702 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.808714 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:39Z","lastTransitionTime":"2026-03-10T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.867356 4795 scope.go:117] "RemoveContainer" containerID="2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38" Mar 10 15:07:39 crc kubenswrapper[4795]: E0310 15:07:39.867626 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.911220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.911302 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.911329 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.911362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:39 crc kubenswrapper[4795]: I0310 15:07:39.911389 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:39Z","lastTransitionTime":"2026-03-10T15:07:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.014245 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.014328 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.014352 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.014380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.014405 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:40Z","lastTransitionTime":"2026-03-10T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.116955 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.116993 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.117005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.117021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.117032 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:40Z","lastTransitionTime":"2026-03-10T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.219847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.219899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.219911 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.219927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.219940 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:40Z","lastTransitionTime":"2026-03-10T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.321888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.321953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.321971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.321998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.322022 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:40Z","lastTransitionTime":"2026-03-10T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.425721 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.425785 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.425804 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.425829 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.425848 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:40Z","lastTransitionTime":"2026-03-10T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.528746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.528805 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.528821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.528839 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.528852 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:40Z","lastTransitionTime":"2026-03-10T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.630970 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.631001 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.631009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.631024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.631033 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:40Z","lastTransitionTime":"2026-03-10T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.734144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.734212 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.734234 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.734289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.734311 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:40Z","lastTransitionTime":"2026-03-10T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.837007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.837045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.837054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.837086 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.837096 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:40Z","lastTransitionTime":"2026-03-10T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.870149 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.878636 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.888418 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.898238 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.911523 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.923538 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.938317 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.945959 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.946011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.946027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.946048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.946084 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:40Z","lastTransitionTime":"2026-03-10T15:07:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.957221 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cd
d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:40 crc kubenswrapper[4795]: I0310 15:07:40.967486 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.048628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.048653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.048661 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.048673 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.048682 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:41Z","lastTransitionTime":"2026-03-10T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.151270 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.151339 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.151356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.151381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.151399 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:41Z","lastTransitionTime":"2026-03-10T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.254178 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.254255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.254274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.254305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.254325 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:41Z","lastTransitionTime":"2026-03-10T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.355997 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.356054 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.356104 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.356136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.356159 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:41Z","lastTransitionTime":"2026-03-10T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.458141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.458186 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.458201 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.458222 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.458236 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:41Z","lastTransitionTime":"2026-03-10T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.476467 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.476525 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.476533 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:41 crc kubenswrapper[4795]: E0310 15:07:41.477023 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:41 crc kubenswrapper[4795]: E0310 15:07:41.477115 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:41 crc kubenswrapper[4795]: E0310 15:07:41.477342 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.560997 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.561033 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.561041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.561055 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.561079 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:41Z","lastTransitionTime":"2026-03-10T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.664388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.664429 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.664441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.664457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.664469 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:41Z","lastTransitionTime":"2026-03-10T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.766818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.766862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.766875 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.766897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.766911 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:41Z","lastTransitionTime":"2026-03-10T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.869751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.869796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.869811 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.869825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.869835 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:41Z","lastTransitionTime":"2026-03-10T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.875952 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b"} Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.876005 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53"} Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.887153 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cd
d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.901745 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.914393 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.926063 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.935771 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.949037 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.959832 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.967840 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.972615 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.972655 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.972663 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.972677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:41 crc kubenswrapper[4795]: I0310 15:07:41.972686 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:41Z","lastTransitionTime":"2026-03-10T15:07:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.074841 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.075086 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.075171 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.075260 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.075344 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:42Z","lastTransitionTime":"2026-03-10T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.178011 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.178342 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.178434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.178521 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.178599 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:42Z","lastTransitionTime":"2026-03-10T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.280974 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.281022 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.281039 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.281062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.281109 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:42Z","lastTransitionTime":"2026-03-10T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.382865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.382899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.382906 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.382921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.382930 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:42Z","lastTransitionTime":"2026-03-10T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.485668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.485733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.485752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.485775 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.485794 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:42Z","lastTransitionTime":"2026-03-10T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.588641 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.588699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.588720 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.588745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.588764 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:42Z","lastTransitionTime":"2026-03-10T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.692196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.692256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.692273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.692298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.692318 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:42Z","lastTransitionTime":"2026-03-10T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.794502 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.794542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.794556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.794575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.794585 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:42Z","lastTransitionTime":"2026-03-10T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.897413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.898567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.898755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.898877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:42 crc kubenswrapper[4795]: I0310 15:07:42.898989 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:42Z","lastTransitionTime":"2026-03-10T15:07:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.001389 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.001451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.001462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.001480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.001494 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:43Z","lastTransitionTime":"2026-03-10T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.103916 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.104331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.104462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.104583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.104793 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:43Z","lastTransitionTime":"2026-03-10T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.208156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.208508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.208727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.208933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.209162 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:43Z","lastTransitionTime":"2026-03-10T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.312662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.312911 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.313027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.313157 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.313239 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:43Z","lastTransitionTime":"2026-03-10T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.398362 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.398469 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.398522 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.398563 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.398599 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.398764 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.398792 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.398811 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.398880 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:59.398858183 +0000 UTC m=+112.564599122 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.399091 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.399251 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.399334 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:59.399310486 +0000 UTC m=+112.565051424 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.399267 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.399372 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.399415 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:59.399402088 +0000 UTC m=+112.565143026 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.399168 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.399465 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:07:59.39945248 +0000 UTC m=+112.565193418 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.399534 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:07:59.399519802 +0000 UTC m=+112.565260740 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.417869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.417963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.417982 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.418038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.418058 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:43Z","lastTransitionTime":"2026-03-10T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.475831 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.476034 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.476391 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.476635 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.476523 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:43 crc kubenswrapper[4795]: E0310 15:07:43.476834 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.520026 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.520217 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.520341 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.520407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.520461 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:43Z","lastTransitionTime":"2026-03-10T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.622353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.622383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.622391 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.622404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.622413 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:43Z","lastTransitionTime":"2026-03-10T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.724463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.724723 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.724809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.724887 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.724945 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:43Z","lastTransitionTime":"2026-03-10T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.827755 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.828014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.828119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.828216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.828306 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:43Z","lastTransitionTime":"2026-03-10T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.930933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.930973 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.930982 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.930995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:43 crc kubenswrapper[4795]: I0310 15:07:43.931003 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:43Z","lastTransitionTime":"2026-03-10T15:07:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.032992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.033064 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.033119 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.033147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.033170 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:44Z","lastTransitionTime":"2026-03-10T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.135992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.136063 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.136113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.136155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.136177 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:44Z","lastTransitionTime":"2026-03-10T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.238395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.238457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.238480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.238510 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.238533 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:44Z","lastTransitionTime":"2026-03-10T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.340961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.341057 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.341112 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.341143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.341166 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:44Z","lastTransitionTime":"2026-03-10T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.443447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.443509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.443527 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.443550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.443567 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:44Z","lastTransitionTime":"2026-03-10T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.546152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.546257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.546275 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.546300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.546318 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:44Z","lastTransitionTime":"2026-03-10T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.649610 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.649653 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.649664 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.649683 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.649694 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:44Z","lastTransitionTime":"2026-03-10T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.752494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.752561 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.752583 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.752611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.752631 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:44Z","lastTransitionTime":"2026-03-10T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.855491 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.855580 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.855598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.855627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.855645 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:44Z","lastTransitionTime":"2026-03-10T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.957376 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.957420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.957431 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.957448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:44 crc kubenswrapper[4795]: I0310 15:07:44.957458 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:44Z","lastTransitionTime":"2026-03-10T15:07:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.058912 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.058940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.058950 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.058964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.058972 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.160668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.160723 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.160743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.160767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.160785 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.216258 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.216321 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.216338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.216361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.216377 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: E0310 15:07:45.237180 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.245800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.245859 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.245908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.245942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.245961 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: E0310 15:07:45.271221 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.275940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.276004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.276021 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.276045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.276062 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: E0310 15:07:45.295540 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.300386 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.300448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.300464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.300488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.300504 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: E0310 15:07:45.320314 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.324944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.325006 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.325029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.325050 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.325094 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: E0310 15:07:45.372760 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4795]: E0310 15:07:45.373007 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.380794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.380842 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.380860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.380886 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.380904 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.478396 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:45 crc kubenswrapper[4795]: E0310 15:07:45.478513 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.478575 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:45 crc kubenswrapper[4795]: E0310 15:07:45.478629 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.478671 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:45 crc kubenswrapper[4795]: E0310 15:07:45.478718 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.482748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.482787 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.482798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.482812 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.482823 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.585986 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.586042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.586060 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.586108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.586156 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.689656 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.689705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.689722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.689745 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.689762 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.769658 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.792408 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.792476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.792492 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.792519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.792538 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.886618 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7"} Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.895280 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.895338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.895355 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.895748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.895804 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.911030 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.931951 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.943325 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.962274 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, 
/tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.980041 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.996033 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:45Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.998057 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.998146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.998166 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 
15:07:45.998194 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:45 crc kubenswrapper[4795]: I0310 15:07:45.998216 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:45Z","lastTransitionTime":"2026-03-10T15:07:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.044888 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.059498 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.100924 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.100964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.100975 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.100989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.100999 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:46Z","lastTransitionTime":"2026-03-10T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.204713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.205040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.205058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.205111 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.205129 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:46Z","lastTransitionTime":"2026-03-10T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.307450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.307497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.307516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.307540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.307557 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:46Z","lastTransitionTime":"2026-03-10T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.410875 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.410995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.411017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.411041 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.411061 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:46Z","lastTransitionTime":"2026-03-10T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.496091 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.514539 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.514598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.514619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.514648 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.514668 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:46Z","lastTransitionTime":"2026-03-10T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.618121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.618167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.618179 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.618196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.618208 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:46Z","lastTransitionTime":"2026-03-10T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.720991 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.721094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.721124 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.721156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.721181 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:46Z","lastTransitionTime":"2026-03-10T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.823388 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.823436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.823449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.823468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.823483 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:46Z","lastTransitionTime":"2026-03-10T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.926361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.926408 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.926424 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.926446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:46 crc kubenswrapper[4795]: I0310 15:07:46.926460 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:46Z","lastTransitionTime":"2026-03-10T15:07:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.028603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.028634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.028643 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.028655 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.028664 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.131237 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.131283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.131319 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.131336 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.131347 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.234675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.234747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.234766 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.234794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.234828 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.338291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.338350 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.338363 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.338383 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.338398 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.441447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.441517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.441541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.441569 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.441590 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.476190 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.476252 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:47 crc kubenswrapper[4795]: E0310 15:07:47.476332 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.476191 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:47 crc kubenswrapper[4795]: E0310 15:07:47.476449 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:47 crc kubenswrapper[4795]: E0310 15:07:47.476503 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.490505 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.505451 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.517995 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.535617 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.544036 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.544103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.544123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.544167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.544183 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.569291 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.591710 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, 
/tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.612528 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.630696 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.647652 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.647751 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.647822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.647847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.647876 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.647897 4795 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.751467 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.751536 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.751560 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.751593 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.751618 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.826159 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-hmw5g"] Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.828247 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hmw5g" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.831181 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.831446 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.831580 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.845314 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.854552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.854862 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.854880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.854898 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.854911 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.870484 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.885422 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.896914 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.919938 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.930776 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.942352 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.956835 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.956867 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.956876 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.956890 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.956899 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:47Z","lastTransitionTime":"2026-03-10T15:07:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.957712 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.961222 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-46vtj\" (UniqueName: \"kubernetes.io/projected/729b6d95-56e9-4944-9397-28161f39fda6-kube-api-access-46vtj\") pod \"node-resolver-hmw5g\" (UID: \"729b6d95-56e9-4944-9397-28161f39fda6\") " pod="openshift-dns/node-resolver-hmw5g" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.961301 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/729b6d95-56e9-4944-9397-28161f39fda6-hosts-file\") pod \"node-resolver-hmw5g\" (UID: \"729b6d95-56e9-4944-9397-28161f39fda6\") " pod="openshift-dns/node-resolver-hmw5g" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.969734 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:47 crc kubenswrapper[4795]: I0310 15:07:47.980375 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.060926 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.061957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.063132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.063400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.063585 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:48Z","lastTransitionTime":"2026-03-10T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.062330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46vtj\" (UniqueName: \"kubernetes.io/projected/729b6d95-56e9-4944-9397-28161f39fda6-kube-api-access-46vtj\") pod \"node-resolver-hmw5g\" (UID: \"729b6d95-56e9-4944-9397-28161f39fda6\") " pod="openshift-dns/node-resolver-hmw5g" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.063887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/729b6d95-56e9-4944-9397-28161f39fda6-hosts-file\") pod \"node-resolver-hmw5g\" (UID: \"729b6d95-56e9-4944-9397-28161f39fda6\") " pod="openshift-dns/node-resolver-hmw5g" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.064004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/729b6d95-56e9-4944-9397-28161f39fda6-hosts-file\") pod \"node-resolver-hmw5g\" (UID: \"729b6d95-56e9-4944-9397-28161f39fda6\") " pod="openshift-dns/node-resolver-hmw5g" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.100775 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46vtj\" (UniqueName: \"kubernetes.io/projected/729b6d95-56e9-4944-9397-28161f39fda6-kube-api-access-46vtj\") pod \"node-resolver-hmw5g\" (UID: \"729b6d95-56e9-4944-9397-28161f39fda6\") " pod="openshift-dns/node-resolver-hmw5g" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.148933 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-hmw5g" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.168423 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.168493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.168512 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.168540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.168558 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:48Z","lastTransitionTime":"2026-03-10T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.194510 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-747vh"] Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.195531 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.196997 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-v49r8"] Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.198753 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-tn44z"] Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.199526 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.200904 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.202415 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.202671 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.202928 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.203284 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.203557 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.204037 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.204323 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-copy-resources" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.204568 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.204477 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.204960 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.206608 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.211390 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.228700 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.250477 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.264407 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.273468 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.273507 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.273519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.273538 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.273551 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:48Z","lastTransitionTime":"2026-03-10T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.276415 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.291398 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.301045 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.326009 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.339510 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cd
d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.354722 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.368099 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.368463 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-run-multus-certs\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.368581 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" 
(UniqueName: \"kubernetes.io/host-path/92ceb516-b88c-44bd-b534-25ea21b31379-rootfs\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.368655 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbnsf\" (UniqueName: \"kubernetes.io/projected/589b366f-9132-43cc-8d7a-d401d396bf06-kube-api-access-nbnsf\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.368740 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-hostroot\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.368818 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-multus-conf-dir\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.368894 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9f6sn\" (UniqueName: \"kubernetes.io/projected/92ceb516-b88c-44bd-b534-25ea21b31379-kube-api-access-9f6sn\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.368968 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-cnibin\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.369048 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-system-cni-dir\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.369448 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-var-lib-cni-bin\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.369677 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-etc-kubernetes\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.369853 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/589b366f-9132-43cc-8d7a-d401d396bf06-cni-binary-copy\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.370015 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/ec2edb56-6323-4261-8954-3e75a645ed42-cni-binary-copy\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.370355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-multus-cni-dir\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.370452 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92ceb516-b88c-44bd-b534-25ea21b31379-proxy-tls\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.370526 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-system-cni-dir\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.370608 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-var-lib-cni-multus\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.370679 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ec2edb56-6323-4261-8954-3e75a645ed42-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.370755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/589b366f-9132-43cc-8d7a-d401d396bf06-multus-daemon-config\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.370866 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-os-release\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.370991 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr57s\" (UniqueName: \"kubernetes.io/projected/ec2edb56-6323-4261-8954-3e75a645ed42-kube-api-access-mr57s\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.371116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-os-release\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 
crc kubenswrapper[4795]: I0310 15:07:48.371209 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-run-k8s-cni-cncf-io\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.371319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-run-netns\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.371385 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/92ceb516-b88c-44bd-b534-25ea21b31379-mcd-auth-proxy-config\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.371415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-var-lib-kubelet\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.371451 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " 
pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.371503 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-cnibin\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.371533 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-multus-socket-dir-parent\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.376266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.376297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.376311 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.376331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.376346 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:48Z","lastTransitionTime":"2026-03-10T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.382528 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.396829 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.408270 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.429365 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"fini
shedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"na
me\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.444585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.461641 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472542 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-cnibin\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472611 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-multus-socket-dir-parent\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472646 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472684 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-run-multus-certs\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472712 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/92ceb516-b88c-44bd-b534-25ea21b31379-rootfs\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472725 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-cnibin\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472736 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbnsf\" (UniqueName: \"kubernetes.io/projected/589b366f-9132-43cc-8d7a-d401d396bf06-kube-api-access-nbnsf\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-hostroot\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-multus-conf-dir\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " 
pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472877 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f6sn\" (UniqueName: \"kubernetes.io/projected/92ceb516-b88c-44bd-b534-25ea21b31379-kube-api-access-9f6sn\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472903 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-cnibin\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-system-cni-dir\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-var-lib-cni-bin\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.472985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/589b366f-9132-43cc-8d7a-d401d396bf06-cni-binary-copy\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc 
kubenswrapper[4795]: I0310 15:07:48.473007 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-etc-kubernetes\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-multus-cni-dir\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473054 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92ceb516-b88c-44bd-b534-25ea21b31379-proxy-tls\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473095 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-system-cni-dir\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473121 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ec2edb56-6323-4261-8954-3e75a645ed42-cni-binary-copy\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473149 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-var-lib-cni-multus\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473172 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ec2edb56-6323-4261-8954-3e75a645ed42-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473197 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/589b366f-9132-43cc-8d7a-d401d396bf06-multus-daemon-config\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473220 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-os-release\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473264 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr57s\" (UniqueName: \"kubernetes.io/projected/ec2edb56-6323-4261-8954-3e75a645ed42-kube-api-access-mr57s\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 
15:07:48.473293 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-os-release\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-run-k8s-cni-cncf-io\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473360 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-run-netns\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473394 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/92ceb516-b88c-44bd-b534-25ea21b31379-mcd-auth-proxy-config\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473438 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-multus-socket-dir-parent\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473419 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-var-lib-kubelet\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473473 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-hostroot\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473546 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-multus-conf-dir\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-etc-kubernetes\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473603 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-multus-cni-dir\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473708 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-os-release\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " 
pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473738 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/92ceb516-b88c-44bd-b534-25ea21b31379-rootfs\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473710 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-run-multus-certs\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473797 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-run-netns\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-os-release\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473891 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-run-k8s-cni-cncf-io\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473915 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-var-lib-kubelet\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473940 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-var-lib-cni-multus\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.473963 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-system-cni-dir\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.474404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-cnibin\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.474941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ec2edb56-6323-4261-8954-3e75a645ed42-cni-binary-copy\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.475024 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/92ceb516-b88c-44bd-b534-25ea21b31379-mcd-auth-proxy-config\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.475100 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ec2edb56-6323-4261-8954-3e75a645ed42-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.475142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-system-cni-dir\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.475305 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/589b366f-9132-43cc-8d7a-d401d396bf06-multus-daemon-config\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.475538 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/589b366f-9132-43cc-8d7a-d401d396bf06-cni-binary-copy\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.475848 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.476030 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ec2edb56-6323-4261-8954-3e75a645ed42-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.477275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/589b366f-9132-43cc-8d7a-d401d396bf06-host-var-lib-cni-bin\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 
10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.479074 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.479100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.479107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.479123 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.479132 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:48Z","lastTransitionTime":"2026-03-10T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.479786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/92ceb516-b88c-44bd-b534-25ea21b31379-proxy-tls\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.490909 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9f6sn\" (UniqueName: \"kubernetes.io/projected/92ceb516-b88c-44bd-b534-25ea21b31379-kube-api-access-9f6sn\") pod \"machine-config-daemon-747vh\" (UID: \"92ceb516-b88c-44bd-b534-25ea21b31379\") " pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.492906 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbnsf\" (UniqueName: \"kubernetes.io/projected/589b366f-9132-43cc-8d7a-d401d396bf06-kube-api-access-nbnsf\") pod \"multus-v49r8\" (UID: \"589b366f-9132-43cc-8d7a-d401d396bf06\") " pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.495879 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr57s\" (UniqueName: \"kubernetes.io/projected/ec2edb56-6323-4261-8954-3e75a645ed42-kube-api-access-mr57s\") pod \"multus-additional-cni-plugins-tn44z\" (UID: \"ec2edb56-6323-4261-8954-3e75a645ed42\") " pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.496554 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.508987 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cd
d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.524549 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.536990 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.538177 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.551105 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.553235 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v49r8" Mar 10 15:07:48 crc kubenswrapper[4795]: W0310 15:07:48.553996 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92ceb516_b88c_44bd_b534_25ea21b31379.slice/crio-e6641faddbd1aeff54a40c0b452b029073a85bf4e6d757e95ea1c2d254c80e10 WatchSource:0}: Error finding container e6641faddbd1aeff54a40c0b452b029073a85bf4e6d757e95ea1c2d254c80e10: Status 404 returned error can't find the container with id e6641faddbd1aeff54a40c0b452b029073a85bf4e6d757e95ea1c2d254c80e10 Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.564580 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.565306 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tn44z" Mar 10 15:07:48 crc kubenswrapper[4795]: W0310 15:07:48.566409 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod589b366f_9132_43cc_8d7a_d401d396bf06.slice/crio-6cbdcb7895503bbe2743aa2a5222bcec90dd7d6363f8e19ce30a5beda83fc6d7 WatchSource:0}: Error finding container 6cbdcb7895503bbe2743aa2a5222bcec90dd7d6363f8e19ce30a5beda83fc6d7: Status 404 returned error can't find the container with id 6cbdcb7895503bbe2743aa2a5222bcec90dd7d6363f8e19ce30a5beda83fc6d7 Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.578858 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.581038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.581086 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.581098 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.581115 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.581127 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:48Z","lastTransitionTime":"2026-03-10T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.584788 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4q8gk"] Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.586881 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.590655 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.591153 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.591221 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.591312 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.591316 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.591443 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.591486 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.611152 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: W0310 15:07:48.667845 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec2edb56_6323_4261_8954_3e75a645ed42.slice/crio-d5caa3826d9f7c88e2cb29cdaf987396fbf287dfeea6c9e6d382d4dc6b1c0752 WatchSource:0}: Error finding container d5caa3826d9f7c88e2cb29cdaf987396fbf287dfeea6c9e6d382d4dc6b1c0752: Status 404 returned error can't find the container with id d5caa3826d9f7c88e2cb29cdaf987396fbf287dfeea6c9e6d382d4dc6b1c0752 Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.677199 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-netns\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.677446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc 
kubenswrapper[4795]: I0310 15:07:48.677748 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-openvswitch\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.678022 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-bin\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.678219 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-kubelet\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.678376 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-netd\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.678537 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-node-log\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc 
kubenswrapper[4795]: I0310 15:07:48.678699 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-etc-openvswitch\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.678863 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-systemd-units\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.678988 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-log-socket\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.679135 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-script-lib\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.679267 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-systemd\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 
15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.679377 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-ovn-kubernetes\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.679558 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r92zs\" (UniqueName: \"kubernetes.io/projected/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-kube-api-access-r92zs\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.679695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-slash\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.679816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-ovn\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.679872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovn-node-metrics-cert\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.679892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-var-lib-openvswitch\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.679964 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-config\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.680002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-env-overrides\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.681970 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.683137 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.683295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.683459 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.683613 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.683763 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:48Z","lastTransitionTime":"2026-03-10T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.703835 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.721903 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3
baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.732761 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.748816 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.759882 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.772762 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.780607 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-slash\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.780645 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-ovn\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.780670 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovn-node-metrics-cert\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.780695 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-var-lib-openvswitch\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.780717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-config\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.780739 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-env-overrides\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.780742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-slash\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.780748 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-ovn\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.780792 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-var-lib-openvswitch\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.780861 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-netns\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-env-overrides\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781410 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-config\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781434 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-netns\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781525 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-openvswitch\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781549 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-bin\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-openvswitch\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781649 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-kubelet\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781673 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-netd\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781689 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-bin\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-kubelet\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781723 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-node-log\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781737 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-netd\") pod \"ovnkube-node-4q8gk\" (UID: 
\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781743 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-node-log\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-etc-openvswitch\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-etc-openvswitch\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781772 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-systemd-units\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781786 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-systemd-units\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc 
kubenswrapper[4795]: I0310 15:07:48.781803 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-log-socket\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781822 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-script-lib\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781823 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-log-socket\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-systemd\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781898 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-ovn-kubernetes\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781919 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r92zs\" (UniqueName: \"kubernetes.io/projected/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-kube-api-access-r92zs\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781947 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-systemd\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.781969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-ovn-kubernetes\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.782277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-script-lib\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.785125 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.787737 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovn-node-metrics-cert\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.793863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.793902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.793913 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.793928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.793940 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:48Z","lastTransitionTime":"2026-03-10T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.795906 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.802296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r92zs\" (UniqueName: \"kubernetes.io/projected/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-kube-api-access-r92zs\") pod \"ovnkube-node-4q8gk\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.807890 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.819304 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.831056 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.841710 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.895320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.895353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.895364 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.895380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.895391 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:48Z","lastTransitionTime":"2026-03-10T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.896018 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.896057 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.896084 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"e6641faddbd1aeff54a40c0b452b029073a85bf4e6d757e95ea1c2d254c80e10"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.897207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v49r8" event={"ID":"589b366f-9132-43cc-8d7a-d401d396bf06","Type":"ContainerStarted","Data":"35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.897257 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v49r8" event={"ID":"589b366f-9132-43cc-8d7a-d401d396bf06","Type":"ContainerStarted","Data":"6cbdcb7895503bbe2743aa2a5222bcec90dd7d6363f8e19ce30a5beda83fc6d7"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.898969 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" 
event={"ID":"ec2edb56-6323-4261-8954-3e75a645ed42","Type":"ContainerStarted","Data":"76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.899000 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" event={"ID":"ec2edb56-6323-4261-8954-3e75a645ed42","Type":"ContainerStarted","Data":"d5caa3826d9f7c88e2cb29cdaf987396fbf287dfeea6c9e6d382d4dc6b1c0752"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.899846 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hmw5g" event={"ID":"729b6d95-56e9-4944-9397-28161f39fda6","Type":"ContainerStarted","Data":"f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.899872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hmw5g" event={"ID":"729b6d95-56e9-4944-9397-28161f39fda6","Type":"ContainerStarted","Data":"4782e251fdaf7ab0d7ae412129cda8fbf95e28439c24c875b1221047c1c559ad"} Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.909875 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.912266 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.922506 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\
\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: W0310 15:07:48.923749 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89b0616b_9d8b_43a5_b8c7_d9cbb4669583.slice/crio-190bf2554acd11f746ecab58421da3e6bccd92617f9c2a94668b4caf99de24e0 WatchSource:0}: Error finding container 
190bf2554acd11f746ecab58421da3e6bccd92617f9c2a94668b4caf99de24e0: Status 404 returned error can't find the container with id 190bf2554acd11f746ecab58421da3e6bccd92617f9c2a94668b4caf99de24e0 Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.943725 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.961091 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3
baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.971763 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.984939 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.995410 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-10T15:07:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.998894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.998927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.998940 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.998957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:48 crc kubenswrapper[4795]: I0310 15:07:48.998969 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:48Z","lastTransitionTime":"2026-03-10T15:07:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.009303 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.020985 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2b
b72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.032660 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.041773 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.055821 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.068627 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.080692 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.090839 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.102479 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.102514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.102523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.102538 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.102547 4795 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:49Z","lastTransitionTime":"2026-03-10T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.102684 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-mul
tus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"sta
rtTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.114235 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cd
d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.126004 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.138554 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.150329 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.161585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.184248 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.204259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.204283 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.204291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.204304 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.204314 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:49Z","lastTransitionTime":"2026-03-10T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.225476 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.265641 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.306575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.306611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.306621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.306637 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.306649 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:49Z","lastTransitionTime":"2026-03-10T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.317770 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.351816 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4
a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.384769 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.411335 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.411421 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.411447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:49 crc 
kubenswrapper[4795]: I0310 15:07:49.411477 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.411515 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:49Z","lastTransitionTime":"2026-03-10T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.431008 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.476304 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:49 crc kubenswrapper[4795]: E0310 15:07:49.476429 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.476742 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:49 crc kubenswrapper[4795]: E0310 15:07:49.476789 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.476912 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:49 crc kubenswrapper[4795]: E0310 15:07:49.476958 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.514711 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.515138 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.515273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.515399 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.515503 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:49Z","lastTransitionTime":"2026-03-10T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.618034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.618100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.618113 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.618131 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.618142 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:49Z","lastTransitionTime":"2026-03-10T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.721425 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.721638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.721654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.721675 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.721689 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:49Z","lastTransitionTime":"2026-03-10T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.824390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.824426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.824434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.824446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.824454 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:49Z","lastTransitionTime":"2026-03-10T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.903539 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9" exitCode=0 Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.903616 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.903656 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"190bf2554acd11f746ecab58421da3e6bccd92617f9c2a94668b4caf99de24e0"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.904947 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec2edb56-6323-4261-8954-3e75a645ed42" containerID="76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f" exitCode=0 Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.905024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" event={"ID":"ec2edb56-6323-4261-8954-3e75a645ed42","Type":"ContainerDied","Data":"76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.929622 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.931474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.931511 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.931525 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.931547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.931563 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:49Z","lastTransitionTime":"2026-03-10T15:07:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.942029 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.955328 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.966867 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.977914 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.986867 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:49 crc kubenswrapper[4795]: I0310 15:07:49.999229 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:49Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.011699 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.030276 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.034457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.034516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.034526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.034541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.034550 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:50Z","lastTransitionTime":"2026-03-10T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.060016 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.071817 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.086994 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.098697 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.112939 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.130028 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.136365 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.136427 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.136443 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.136463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.136478 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:50Z","lastTransitionTime":"2026-03-10T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.138730 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.180751 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.197007 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.214820 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.231135 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.239413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.239442 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.239450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.239462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.239471 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:50Z","lastTransitionTime":"2026-03-10T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.271260 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.309860 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.342573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.344474 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.344493 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.344518 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.344534 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:50Z","lastTransitionTime":"2026-03-10T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.344620 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.387783 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, 
/tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.428351 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.447326 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.447360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.447370 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.447384 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.447394 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:50Z","lastTransitionTime":"2026-03-10T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.477034 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.507890 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.549496 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:50 crc kubenswrapper[4795]: 
I0310 15:07:50.549533 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.549542 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.549556 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.549567 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:50Z","lastTransitionTime":"2026-03-10T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.556350 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.651475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.651531 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.651540 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.651552 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.651561 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:50Z","lastTransitionTime":"2026-03-10T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.754690 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.754735 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.754747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.754780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.754801 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:50Z","lastTransitionTime":"2026-03-10T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.857907 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.857953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.857962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.857979 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.857990 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:50Z","lastTransitionTime":"2026-03-10T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.910649 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec2edb56-6323-4261-8954-3e75a645ed42" containerID="661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43" exitCode=0 Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.910719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" event={"ID":"ec2edb56-6323-4261-8954-3e75a645ed42","Type":"ContainerDied","Data":"661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.915716 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.916420 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65" exitCode=1 Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.916466 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.916499 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.916516 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" 
event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.916530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.916545 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.916561 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.928784 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.950862 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.960108 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:50 crc 
kubenswrapper[4795]: I0310 15:07:50.960157 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.960175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.960194 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.960209 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:50Z","lastTransitionTime":"2026-03-10T15:07:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.965399 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.978508 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:50 crc kubenswrapper[4795]: I0310 15:07:50.991733 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:50Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.004243 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.018998 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cd
d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.032109 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.043542 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.056103 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.062204 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.062241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.062254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.062270 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.062283 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:51Z","lastTransitionTime":"2026-03-10T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.078527 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wai
ting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\
\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.095411 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a
5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f5840
8f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad22067906
4aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.105441 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.119531 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.165696 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.165735 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.165747 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.165764 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.165775 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:51Z","lastTransitionTime":"2026-03-10T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.268888 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.268931 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.268946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.268969 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.268985 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:51Z","lastTransitionTime":"2026-03-10T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.371669 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.371712 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.371727 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.371744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.371755 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:51Z","lastTransitionTime":"2026-03-10T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.473554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.473585 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.473597 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.473611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.473622 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:51Z","lastTransitionTime":"2026-03-10T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.476252 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.476294 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.476264 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:51 crc kubenswrapper[4795]: E0310 15:07:51.476383 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:51 crc kubenswrapper[4795]: E0310 15:07:51.476480 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:51 crc kubenswrapper[4795]: E0310 15:07:51.476552 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.579398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.579458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.579469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.579485 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.579500 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:51Z","lastTransitionTime":"2026-03-10T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.681846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.681891 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.681904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.681922 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.681937 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:51Z","lastTransitionTime":"2026-03-10T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.784865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.784913 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.784925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.784943 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.784955 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:51Z","lastTransitionTime":"2026-03-10T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.887798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.887849 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.887863 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.887885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.887901 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:51Z","lastTransitionTime":"2026-03-10T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.920817 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec2edb56-6323-4261-8954-3e75a645ed42" containerID="e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e" exitCode=0 Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.920863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" event={"ID":"ec2edb56-6323-4261-8954-3e75a645ed42","Type":"ContainerDied","Data":"e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e"} Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.941124 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba
93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\"
:\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.954943 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.969745 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.982802 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.990567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.990603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.990614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 
15:07:51.990633 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.990648 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:51Z","lastTransitionTime":"2026-03-10T15:07:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:51 crc kubenswrapper[4795]: I0310 15:07:51.997910 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:51Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.016604 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.028257 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.041397 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cd
d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.053349 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.072310 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.082972 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.092965 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.093784 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.093819 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.093830 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:52 crc 
kubenswrapper[4795]: I0310 15:07:52.093847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.093859 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:52Z","lastTransitionTime":"2026-03-10T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.110319 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade
41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.128522 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19
c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.196359 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.196404 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.196428 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.196448 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.196457 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:52Z","lastTransitionTime":"2026-03-10T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.299264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.299358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.299406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.299430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.299476 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:52Z","lastTransitionTime":"2026-03-10T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.401246 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.401274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.401284 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.401298 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.401308 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:52Z","lastTransitionTime":"2026-03-10T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.503581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.503614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.503622 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.503634 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.503642 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:52Z","lastTransitionTime":"2026-03-10T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.606241 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.606278 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.606289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.606303 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.606312 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:52Z","lastTransitionTime":"2026-03-10T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.708116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.708167 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.708183 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.708205 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.708220 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:52Z","lastTransitionTime":"2026-03-10T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.810941 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.811004 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.811023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.811050 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.811095 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:52Z","lastTransitionTime":"2026-03-10T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.913001 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.913101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.913132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.913165 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.913190 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:52Z","lastTransitionTime":"2026-03-10T15:07:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.927061 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec2edb56-6323-4261-8954-3e75a645ed42" containerID="27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78" exitCode=0 Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.927134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" event={"ID":"ec2edb56-6323-4261-8954-3e75a645ed42","Type":"ContainerDied","Data":"27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78"} Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.932295 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.933816 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc"} Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.948969 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.969902 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:52 crc kubenswrapper[4795]: I0310 15:07:52.992234 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:52Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.013717 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.016572 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.016650 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.016672 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.016699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.016722 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:53Z","lastTransitionTime":"2026-03-10T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.032731 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.056570 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, 
/tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.084378 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.100404 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.115659 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.121407 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.121453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.121464 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.121481 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.121491 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:53Z","lastTransitionTime":"2026-03-10T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.137965 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.157876 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.168340 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.179157 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.186849 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.224563 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.224618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.224638 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.224662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.224715 4795 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:53Z","lastTransitionTime":"2026-03-10T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.328155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.328215 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.328273 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.328300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.328318 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:53Z","lastTransitionTime":"2026-03-10T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.430592 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.430636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.430646 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.430660 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.430669 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:53Z","lastTransitionTime":"2026-03-10T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.476268 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.476376 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:53 crc kubenswrapper[4795]: E0310 15:07:53.476459 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.476501 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:53 crc kubenswrapper[4795]: E0310 15:07:53.476704 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:53 crc kubenswrapper[4795]: E0310 15:07:53.476874 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.532776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.532822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.532833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.532850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.532864 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:53Z","lastTransitionTime":"2026-03-10T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.635340 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.635400 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.635420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.635447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.635469 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:53Z","lastTransitionTime":"2026-03-10T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.738780 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.738847 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.738865 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.738889 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.738933 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:53Z","lastTransitionTime":"2026-03-10T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.842035 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.842135 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.842157 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.842184 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.842204 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:53Z","lastTransitionTime":"2026-03-10T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.941377 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec2edb56-6323-4261-8954-3e75a645ed42" containerID="28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2" exitCode=0 Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.941460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" event={"ID":"ec2edb56-6323-4261-8954-3e75a645ed42","Type":"ContainerDied","Data":"28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2"} Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.945748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.945821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.945846 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.945877 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.945898 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:53Z","lastTransitionTime":"2026-03-10T15:07:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.966519 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:53 crc kubenswrapper[4795]: I0310 15:07:53.990471 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:53Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.012495 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.037751 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.048965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.049000 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.049009 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.049057 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.049081 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:54Z","lastTransitionTime":"2026-03-10T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.050724 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.070007 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.082996 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.105921 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.120726 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.137872 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.151034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.151450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.151462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.151476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.151486 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:54Z","lastTransitionTime":"2026-03-10T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.153132 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cd
d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.164463 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.178211 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.192266 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.254371 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.254420 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.254436 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.254455 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.254468 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:54Z","lastTransitionTime":"2026-03-10T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.357666 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.357730 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.357752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.357782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.357807 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:54Z","lastTransitionTime":"2026-03-10T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.468379 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.468449 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.468475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.468506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.468531 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:54Z","lastTransitionTime":"2026-03-10T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.476390 4795 scope.go:117] "RemoveContainer" containerID="2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.572007 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.572052 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.572083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.572100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.572114 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:54Z","lastTransitionTime":"2026-03-10T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.626893 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-jc8ps"] Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.627362 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jc8ps" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.629856 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.629922 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.631538 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.631741 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.654715 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}}
,{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.667620 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.674747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.674798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.674813 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:54 crc 
kubenswrapper[4795]: I0310 15:07:54.674835 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.674897 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:54Z","lastTransitionTime":"2026-03-10T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.692174 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade
41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.705469 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.722376 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.739984 4795 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.742740 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/859bda50-59de-4471-a2d1-785d7b1d06d7-host\") pod \"node-ca-jc8ps\" (UID: \"859bda50-59de-4471-a2d1-785d7b1d06d7\") " pod="openshift-image-registry/node-ca-jc8ps" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.742884 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/859bda50-59de-4471-a2d1-785d7b1d06d7-serviceca\") pod \"node-ca-jc8ps\" (UID: \"859bda50-59de-4471-a2d1-785d7b1d06d7\") " pod="openshift-image-registry/node-ca-jc8ps" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.742995 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-k6vf9\" (UniqueName: \"kubernetes.io/projected/859bda50-59de-4471-a2d1-785d7b1d06d7-kube-api-access-k6vf9\") pod \"node-ca-jc8ps\" (UID: \"859bda50-59de-4471-a2d1-785d7b1d06d7\") " pod="openshift-image-registry/node-ca-jc8ps" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.753377 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.768196 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.778005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.778056 4795 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.778100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.778127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.778144 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:54Z","lastTransitionTime":"2026-03-10T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.791410 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cd
d886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.813822 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.826656 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.838773 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.843907 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/859bda50-59de-4471-a2d1-785d7b1d06d7-serviceca\") pod \"node-ca-jc8ps\" (UID: \"859bda50-59de-4471-a2d1-785d7b1d06d7\") " pod="openshift-image-registry/node-ca-jc8ps" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.843966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6vf9\" (UniqueName: \"kubernetes.io/projected/859bda50-59de-4471-a2d1-785d7b1d06d7-kube-api-access-k6vf9\") pod \"node-ca-jc8ps\" (UID: \"859bda50-59de-4471-a2d1-785d7b1d06d7\") " pod="openshift-image-registry/node-ca-jc8ps" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.844000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/859bda50-59de-4471-a2d1-785d7b1d06d7-host\") pod \"node-ca-jc8ps\" (UID: \"859bda50-59de-4471-a2d1-785d7b1d06d7\") " pod="openshift-image-registry/node-ca-jc8ps" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.844122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/859bda50-59de-4471-a2d1-785d7b1d06d7-host\") pod \"node-ca-jc8ps\" (UID: \"859bda50-59de-4471-a2d1-785d7b1d06d7\") " pod="openshift-image-registry/node-ca-jc8ps" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.845080 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/859bda50-59de-4471-a2d1-785d7b1d06d7-serviceca\") pod \"node-ca-jc8ps\" (UID: \"859bda50-59de-4471-a2d1-785d7b1d06d7\") " pod="openshift-image-registry/node-ca-jc8ps" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.849781 4795 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.866101 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.870601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6vf9\" (UniqueName: \"kubernetes.io/projected/859bda50-59de-4471-a2d1-785d7b1d06d7-kube-api-access-k6vf9\") pod \"node-ca-jc8ps\" (UID: \"859bda50-59de-4471-a2d1-785d7b1d06d7\") " pod="openshift-image-registry/node-ca-jc8ps" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 
15:07:54.881173 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.881354 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.881437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.881509 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.881569 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:54Z","lastTransitionTime":"2026-03-10T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.884451 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.949397 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec2edb56-6323-4261-8954-3e75a645ed42" containerID="ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954" exitCode=0 Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.949598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" event={"ID":"ec2edb56-6323-4261-8954-3e75a645ed42","Type":"ContainerDied","Data":"ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.955327 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-jc8ps" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.956314 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.963378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.963741 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.963981 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:07:54 crc kubenswrapper[4795]: W0310 15:07:54.978000 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod859bda50_59de_4471_a2d1_785d7b1d06d7.slice/crio-91c0b6520d115424fa12c4a708b27df015e75ca83c07f719ea4d1e79721675b3 WatchSource:0}: Error finding container 91c0b6520d115424fa12c4a708b27df015e75ca83c07f719ea4d1e79721675b3: Status 404 returned error can't find the container with id 91c0b6520d115424fa12c4a708b27df015e75ca83c07f719ea4d1e79721675b3 Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.982548 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2
efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.987756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.987794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.987809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.987828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.987837 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:54Z","lastTransitionTime":"2026-03-10T15:07:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:54 crc kubenswrapper[4795]: I0310 15:07:54.997509 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:54Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.011967 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.029401 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.041737 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.059034 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.079617 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.092571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.092621 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.092631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.092651 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.092662 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.100631 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.112176 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.123510 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.146383 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.167062 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.189237 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.195750 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.195809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.195818 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc 
kubenswrapper[4795]: I0310 15:07:55.195837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.195849 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.204250 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.220172 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.235613 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.244280 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.261141 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.275927 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.288698 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.299734 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.299786 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.299800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.299817 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.299828 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.306826 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.320895 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.330848 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.343527 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.355513 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.376572 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.397787 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.402796 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.402852 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.402869 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.402893 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.402912 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.415464 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.432848 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.476007 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.476100 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:55 crc kubenswrapper[4795]: E0310 15:07:55.476178 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.476010 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:55 crc kubenswrapper[4795]: E0310 15:07:55.476245 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:55 crc kubenswrapper[4795]: E0310 15:07:55.476323 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.505828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.505871 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.505883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.505901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.505913 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.535187 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.535257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.535274 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.535300 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.535317 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: E0310 15:07:55.554741 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.559581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.559611 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.559622 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.559636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.559646 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: E0310 15:07:55.576904 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.581636 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.581701 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.581722 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.581746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.581766 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: E0310 15:07:55.597852 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.602618 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.602681 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.602705 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.602737 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.602758 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: E0310 15:07:55.618707 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.623531 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.623603 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.623627 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.623654 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.623675 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: E0310 15:07:55.660198 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:55 crc kubenswrapper[4795]: E0310 15:07:55.660472 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.662160 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.662216 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.662240 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.662269 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.662291 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.767792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.767880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.767902 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.767933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.767967 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.871967 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.872015 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.872027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.872043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.872085 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.970188 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" event={"ID":"ec2edb56-6323-4261-8954-3e75a645ed42","Type":"ContainerStarted","Data":"93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.975908 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.975944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.975958 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.975978 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.975992 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:55Z","lastTransitionTime":"2026-03-10T15:07:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.976794 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.977594 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.978252 4795 scope.go:117] "RemoveContainer" containerID="05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.978847 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.978896 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.978970 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.982017 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jc8ps" event={"ID":"859bda50-59de-4471-a2d1-785d7b1d06d7","Type":"ContainerStarted","Data":"035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 15:07:55.982072 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-jc8ps" event={"ID":"859bda50-59de-4471-a2d1-785d7b1d06d7","Type":"ContainerStarted","Data":"91c0b6520d115424fa12c4a708b27df015e75ca83c07f719ea4d1e79721675b3"} Mar 10 15:07:55 crc kubenswrapper[4795]: I0310 
15:07:55.989980 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:55Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.007972 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.013498 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.014525 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 
15:07:56.034709 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.058428 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.070921 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.079083 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.079117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.079141 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:56 crc 
kubenswrapper[4795]: I0310 15:07:56.079158 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.079169 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:56Z","lastTransitionTime":"2026-03-10T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.088409 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce
56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.104303 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.123869 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.133279 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.145812 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.158833 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.170615 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.181097 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.181267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.181291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.181299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.181312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.181354 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:56Z","lastTransitionTime":"2026-03-10T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.192591 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.202469 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.215157 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.227139 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.245820 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-acl-logging 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 
1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.265637 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.281155 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.285831 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.285927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.286269 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.286339 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.286359 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:56Z","lastTransitionTime":"2026-03-10T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.297145 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.310023 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc 
kubenswrapper[4795]: I0310 15:07:56.325974 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc 
kubenswrapper[4795]: I0310 15:07:56.338014 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.349236 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18
e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.362496 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\"
,\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.372903 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.386490 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.388914 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.388933 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.388942 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.388956 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.388964 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:56Z","lastTransitionTime":"2026-03-10T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.407570 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.419229 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:56Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.491739 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.491792 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.491808 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.491829 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.491845 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:56Z","lastTransitionTime":"2026-03-10T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.595394 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.595457 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.595470 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.595488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.595500 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:56Z","lastTransitionTime":"2026-03-10T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.698843 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.698926 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.698963 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.698995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.699021 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:56Z","lastTransitionTime":"2026-03-10T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.802018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.802162 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.802188 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.802220 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.802241 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:56Z","lastTransitionTime":"2026-03-10T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.905513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.905573 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.905590 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.905614 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.905633 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:56Z","lastTransitionTime":"2026-03-10T15:07:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.992579 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:07:56 crc kubenswrapper[4795]: I0310 15:07:56.993788 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f"} Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.008820 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.008868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.008884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.008905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.008922 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:57Z","lastTransitionTime":"2026-03-10T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.017120 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.034117 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.048558 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.070859 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, 
/tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.091700 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.111733 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.111793 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.111809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.111834 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.111850 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:57Z","lastTransitionTime":"2026-03-10T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.112616 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.131375 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.151765 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.174512 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.208519 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPa
th\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.214639 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.214702 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.214720 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.214746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.214766 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:57Z","lastTransitionTime":"2026-03-10T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.240841 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.260881 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.289582 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.309772 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.317860 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.317921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.317946 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.317976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.318001 4795 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:57Z","lastTransitionTime":"2026-03-10T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.344617 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-mul
tus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"sta
rtTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.424460 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.424531 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.424550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.424576 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.424598 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:57Z","lastTransitionTime":"2026-03-10T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.475925 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:57 crc kubenswrapper[4795]: E0310 15:07:57.476083 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.476101 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.476144 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:57 crc kubenswrapper[4795]: E0310 15:07:57.476214 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:57 crc kubenswrapper[4795]: E0310 15:07:57.476401 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.502506 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\
"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e77903
6cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4911
7b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.516210 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.527089 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.527148 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.527168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:57 crc 
kubenswrapper[4795]: I0310 15:07:57.527193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.527219 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:57Z","lastTransitionTime":"2026-03-10T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.528707 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce
56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.537456 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.549083 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.559208 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.572776 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.585674 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.597908 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.616520 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, 
/tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.629869 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.629919 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.629973 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.629983 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.629998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.630008 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:57Z","lastTransitionTime":"2026-03-10T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.642359 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.654336 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.668139 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.690596 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPa
th\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:57Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.732122 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.732160 4795 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.732170 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.732182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.732191 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:57Z","lastTransitionTime":"2026-03-10T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.879118 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.879159 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.879168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.879182 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.879190 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:57Z","lastTransitionTime":"2026-03-10T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.981398 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.981437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.981447 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.981462 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:57 crc kubenswrapper[4795]: I0310 15:07:57.981472 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:57Z","lastTransitionTime":"2026-03-10T15:07:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.084040 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.084106 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.084125 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.084145 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.084156 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.186520 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.186856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.186867 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.186885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.186893 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.288517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.288553 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.288567 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.288584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.288596 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.391735 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.391773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.391783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.391799 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.391810 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.494446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.494488 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.494499 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.494517 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.494529 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.596828 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.596904 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.596927 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.596957 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.596983 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.699367 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.699406 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.699418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.699435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.699447 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.802016 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.802059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.802100 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.802120 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.802133 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.904677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.904743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.904759 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.904783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:58 crc kubenswrapper[4795]: I0310 15:07:58.904800 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:58Z","lastTransitionTime":"2026-03-10T15:07:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.002708 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/0.log" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.007200 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.007295 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.007313 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.007338 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.007337 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.007356 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:59Z","lastTransitionTime":"2026-03-10T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.008420 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9" exitCode=1 Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.008470 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9"} Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.009467 4795 scope.go:117] "RemoveContainer" containerID="024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.032932 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:355
12335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.051196 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.068463 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.083352 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.109668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.109729 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.109748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.109773 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.109793 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:59Z","lastTransitionTime":"2026-03-10T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.117435 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.131940 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.146708 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.161234 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.176211 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.192204 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"emoval\\\\nI0310 15:07:58.259833 6629 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 15:07:58.259843 6629 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 15:07:58.259866 6629 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:07:58.259879 6629 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:58.259882 6629 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:58.259907 6629 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:58.259924 6629 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 15:07:58.259930 6629 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:58.259939 6629 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:58.259948 6629 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:58.259959 6629 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:58.259981 6629 handler.go:190] Sending *v1.EgressIP event 
handler 8 for removal\\\\nI0310 15:07:58.260001 6629 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 15:07:58.260026 6629 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 15:07:58.260056 6629 factory.go:656] Stopping watch factory\\\\nI0310 15:07:58.260137 6629 ovnkube.go:599] Stopped ovnkube\\\\nI0310 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.211652 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.211717 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.211742 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.211771 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 
15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.211795 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:59Z","lastTransitionTime":"2026-03-10T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.220491 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerSt
atuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.233677 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.252814 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4c
e2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.264870 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.278875 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:07:59Z is after 2025-08-24T17:21:41Z" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.366644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:59 crc 
kubenswrapper[4795]: I0310 15:07:59.366707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.366724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.366748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.366765 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:59Z","lastTransitionTime":"2026-03-10T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.399295 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.399331 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.399428 4795 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.399552 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:31.399535959 +0000 UTC m=+144.565276857 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.399565 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.399608 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.399629 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.399700 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:31.399677232 +0000 UTC m=+144.565418170 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.469451 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.469508 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.469526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.469550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.469568 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:59Z","lastTransitionTime":"2026-03-10T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.475905 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.476025 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.476148 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.476239 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.475912 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.476366 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.500264 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.500434 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.500493 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.500713 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.500813 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:31.500785686 +0000 UTC m=+144.666526624 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.501267 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:08:31.501244108 +0000 UTC m=+144.666985066 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.501421 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.501459 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.501482 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 10 15:07:59 crc kubenswrapper[4795]: E0310 15:07:59.501540 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:31.501523516 +0000 UTC m=+144.667264454 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.572879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.573116 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.573193 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.573268 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.573345 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:59Z","lastTransitionTime":"2026-03-10T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.675288 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.675487 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.675547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.675642 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.675699 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:59Z","lastTransitionTime":"2026-03-10T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.778149 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.778228 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.778255 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.778285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.778307 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:59Z","lastTransitionTime":"2026-03-10T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.880395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.880434 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.880444 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.880458 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.880467 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:59Z","lastTransitionTime":"2026-03-10T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.984153 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.984209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.984261 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.984292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:07:59 crc kubenswrapper[4795]: I0310 15:07:59.984314 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:07:59Z","lastTransitionTime":"2026-03-10T15:07:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.013945 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/0.log" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.016377 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.017105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730"} Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.018390 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.036936 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.048193 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.062816 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.085777 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.086441 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.086504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.086516 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.086534 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.086548 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:00Z","lastTransitionTime":"2026-03-10T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.098135 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.109449 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.122451 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.134030 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.149415 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, 
/tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.174396 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"emoval\\\\nI0310 15:07:58.259833 6629 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 15:07:58.259843 6629 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 15:07:58.259866 6629 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:07:58.259879 6629 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:58.259882 6629 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:58.259907 6629 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:58.259924 6629 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 15:07:58.259930 6629 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:58.259939 6629 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:58.259948 6629 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:58.259959 6629 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:58.259981 6629 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:58.260001 6629 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 
15:07:58.260026 6629 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 15:07:58.260056 6629 factory.go:656] Stopping watch factory\\\\nI0310 15:07:58.260137 6629 ovnkube.go:599] Stopped ovnkube\\\\nI0310 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPa
th\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.188422 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.188598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.188612 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.188619 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.188631 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.188639 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:00Z","lastTransitionTime":"2026-03-10T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.203621 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.223631 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4c
e2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.251474 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19
c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.273340 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.290744 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.290808 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.290829 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:00 crc 
kubenswrapper[4795]: I0310 15:08:00.290855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.290873 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:00Z","lastTransitionTime":"2026-03-10T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.394358 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.394414 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.394430 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.394453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.394472 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:00Z","lastTransitionTime":"2026-03-10T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.497245 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.497312 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.497331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.497354 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.497373 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:00Z","lastTransitionTime":"2026-03-10T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.512343 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk"] Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.513014 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.515533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.515831 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.534043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.557062 4795 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\"
:\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.575428 4795 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.594506 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.599986 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.600049 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.600133 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.600161 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.600181 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:00Z","lastTransitionTime":"2026-03-10T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.611368 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.614810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.614890 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82dpl\" (UniqueName: \"kubernetes.io/projected/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-kube-api-access-82dpl\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 
15:08:00.614947 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.615112 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.626692 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.647392 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.663955 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.681377 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.699791 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.702840 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.702925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.702953 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.702982 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.703006 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:00Z","lastTransitionTime":"2026-03-10T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.716575 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.716653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82dpl\" (UniqueName: \"kubernetes.io/projected/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-kube-api-access-82dpl\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.716707 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.716802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.717900 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.718008 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.718191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.729230 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.743892 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82dpl\" (UniqueName: \"kubernetes.io/projected/ce2b7f29-4124-474a-9f51-18aa60a6fdfb-kube-api-access-82dpl\") pod \"ovnkube-control-plane-749d76644c-6kmmk\" (UID: \"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.748409 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ 
MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/
var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"emoval\\\\nI0310 15:07:58.259833 6629 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 15:07:58.259843 6629 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 15:07:58.259866 6629 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:07:58.259879 6629 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:58.259882 6629 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:58.259907 6629 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:58.259924 6629 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 15:07:58.259930 6629 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:58.259939 6629 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:58.259948 6629 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:58.259959 
6629 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:58.259981 6629 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:58.260001 6629 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 15:07:58.260026 6629 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 15:07:58.260056 6629 factory.go:656] Stopping watch factory\\\\nI0310 15:07:58.260137 6629 ovnkube.go:599] Stopped ovnkube\\\\nI0310 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"}
,{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.782248 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.800348 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.805921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.806019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.806038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.806062 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.806114 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:00Z","lastTransitionTime":"2026-03-10T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.822796 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.837332 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.838863 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:00 crc kubenswrapper[4795]: W0310 15:08:00.859304 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce2b7f29_4124_474a_9f51_18aa60a6fdfb.slice/crio-f6b88641b4b5823229b7027b9ad057399d8d44494f1a800719439377e1f06ee1 WatchSource:0}: Error finding container f6b88641b4b5823229b7027b9ad057399d8d44494f1a800719439377e1f06ee1: Status 404 returned error can't find the container with id f6b88641b4b5823229b7027b9ad057399d8d44494f1a800719439377e1f06ee1 Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.915481 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.915529 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.915546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.915566 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:00 crc kubenswrapper[4795]: I0310 15:08:00.915581 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:00Z","lastTransitionTime":"2026-03-10T15:08:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.017986 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.018027 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.018038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.018103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.018117 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:01Z","lastTransitionTime":"2026-03-10T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.021902 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/1.log" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.023264 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/0.log" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.025876 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.026799 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730" exitCode=1 Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.026853 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.026899 4795 scope.go:117] "RemoveContainer" containerID="024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.027638 4795 scope.go:117] "RemoveContainer" containerID="5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730" Mar 10 15:08:01 crc kubenswrapper[4795]: E0310 15:08:01.027846 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.028569 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" event={"ID":"ce2b7f29-4124-474a-9f51-18aa60a6fdfb","Type":"ContainerStarted","Data":"f6b88641b4b5823229b7027b9ad057399d8d44494f1a800719439377e1f06ee1"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.049190 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eea
ee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\
\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.064695 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aa
f09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc 
kubenswrapper[4795]: I0310 15:08:01.082959 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.102689 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.120466 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.120486 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.120494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 
15:08:01.120506 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.120515 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:01Z","lastTransitionTime":"2026-03-10T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.130862 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.166995 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.183764 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.194962 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.210132 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, 
/tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.222043 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.222082 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.222091 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 
15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.222103 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.222112 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:01Z","lastTransitionTime":"2026-03-10T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.238189 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"emoval\\\\nI0310 15:07:58.259833 6629 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 15:07:58.259843 6629 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 15:07:58.259866 6629 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:07:58.259879 6629 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:58.259882 6629 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:58.259907 6629 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:58.259924 6629 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 15:07:58.259930 6629 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:58.259939 6629 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:58.259948 6629 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:58.259959 6629 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:58.259981 6629 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:58.260001 6629 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 
15:07:58.260026 6629 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 15:07:58.260056 6629 factory.go:656] Stopping watch factory\\\\nI0310 15:07:58.260137 6629 ovnkube.go:599] Stopped ovnkube\\\\nI0310 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"services.LB{}\\\\nF0310 15:08:00.077347 6834 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z]\\\\nI0310 15:08:00.077348 6834 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0310 15:08:00.077360 6834 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0310 15:08:00.077351 6834 services_controller.go:451] Built 
service openshift-consol\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595c
c957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.251293 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.265152 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.272613 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-zjg2f"] Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.273032 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:01 crc kubenswrapper[4795]: E0310 15:08:01.273106 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.284858 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7
e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c
4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2
026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.297123 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.316412 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca
7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.321335 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf46p\" (UniqueName: \"kubernetes.io/projected/3036349b-f184-48aa-b5ab-de9c5c7ae511-kube-api-access-nf46p\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.321457 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.324252 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.324299 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.324320 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.324348 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.324366 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:01Z","lastTransitionTime":"2026-03-10T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.332545 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.348304 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.369432 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.388921 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc 
kubenswrapper[4795]: I0310 15:08:01.400465 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.412791 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.422808 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.422885 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf46p\" (UniqueName: \"kubernetes.io/projected/3036349b-f184-48aa-b5ab-de9c5c7ae511-kube-api-access-nf46p\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:01 crc kubenswrapper[4795]: E0310 15:08:01.423032 4795 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:01 crc kubenswrapper[4795]: E0310 15:08:01.423148 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs podName:3036349b-f184-48aa-b5ab-de9c5c7ae511 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:01.923125437 +0000 UTC m=+115.088866345 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs") pod "network-metrics-daemon-zjg2f" (UID: "3036349b-f184-48aa-b5ab-de9c5c7ae511") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.427181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.427267 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.427293 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.427327 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.427351 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:01Z","lastTransitionTime":"2026-03-10T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.428629 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.441651 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf46p\" (UniqueName: \"kubernetes.io/projected/3036349b-f184-48aa-b5ab-de9c5c7ae511-kube-api-access-nf46p\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.444173 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.463999 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.475925 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.475972 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:01 crc kubenswrapper[4795]: E0310 15:08:01.476031 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.476118 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:01 crc kubenswrapper[4795]: E0310 15:08:01.476274 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:01 crc kubenswrapper[4795]: E0310 15:08:01.476386 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.480021 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.492418 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.509282 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.526361 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.531156 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.531221 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.531243 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.531317 4795 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.531368 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:01Z","lastTransitionTime":"2026-03-10T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.557922 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://024c1db29c00520748bd4269132f38113b6b510ead87d6ff56d5ef5b4a5a7ed9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:58Z\\\",\\\"message\\\":\\\"emoval\\\\nI0310 15:07:58.259833 6629 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0310 15:07:58.259843 6629 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0310 15:07:58.259866 6629 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0310 15:07:58.259879 6629 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0310 15:07:58.259882 6629 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0310 15:07:58.259907 6629 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0310 15:07:58.259924 6629 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0310 15:07:58.259930 6629 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0310 15:07:58.259939 6629 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0310 15:07:58.259948 6629 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0310 15:07:58.259959 6629 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0310 15:07:58.259981 6629 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0310 15:07:58.260001 6629 handler.go:208] Removed *v1.Node event handler 2\\\\nI0310 
15:07:58.260026 6629 handler.go:208] Removed *v1.Node event handler 7\\\\nI0310 15:07:58.260056 6629 factory.go:656] Stopping watch factory\\\\nI0310 15:07:58.260137 6629 ovnkube.go:599] Stopped ovnkube\\\\nI0310 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"services.LB{}\\\\nF0310 15:08:00.077347 6834 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z]\\\\nI0310 15:08:00.077348 6834 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0310 15:08:00.077360 6834 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0310 15:08:00.077351 6834 services_controller.go:451] Built 
service openshift-consol\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595c
c957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.589354 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.610492 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.629342 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.634297 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.634331 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.634343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.634361 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.634373 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:01Z","lastTransitionTime":"2026-03-10T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.643622 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:01Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.737725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.737800 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.737825 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.737855 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.737877 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:01Z","lastTransitionTime":"2026-03-10T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.841821 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.841928 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.842017 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.842048 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.842101 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:01Z","lastTransitionTime":"2026-03-10T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.933115 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:01 crc kubenswrapper[4795]: E0310 15:08:01.933321 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:01 crc kubenswrapper[4795]: E0310 15:08:01.933411 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs podName:3036349b-f184-48aa-b5ab-de9c5c7ae511 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:02.933388928 +0000 UTC m=+116.099129866 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs") pod "network-metrics-daemon-zjg2f" (UID: "3036349b-f184-48aa-b5ab-de9c5c7ae511") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.944472 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.944532 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.944555 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.944584 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:01 crc kubenswrapper[4795]: I0310 15:08:01.944607 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:01Z","lastTransitionTime":"2026-03-10T15:08:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.035762 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/1.log" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.040003 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.042434 4795 scope.go:117] "RemoveContainer" containerID="5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730" Mar 10 15:08:02 crc kubenswrapper[4795]: E0310 15:08:02.042743 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.044474 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" event={"ID":"ce2b7f29-4124-474a-9f51-18aa60a6fdfb","Type":"ContainerStarted","Data":"fcd5454b4263d1225079ed6f503759129a08d972be337e339a17798d48d01c34"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.044539 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" event={"ID":"ce2b7f29-4124-474a-9f51-18aa60a6fdfb","Type":"ContainerStarted","Data":"56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.047272 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 
15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.047323 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.047340 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.047360 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.047377 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:02Z","lastTransitionTime":"2026-03-10T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.075207 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ 
MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/
var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"services.LB{}\\\\nF0310 15:08:00.077347 6834 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z]\\\\nI0310 15:08:00.077348 6834 
services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0310 15:08:00.077360 6834 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0310 15:08:00.077351 6834 services_controller.go:451] Built service openshift-consol\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni
/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.096987 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.116131 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.146896 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4c
e2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.150707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.150741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.150752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.150767 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.150779 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:02Z","lastTransitionTime":"2026-03-10T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.165371 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.197168 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca
7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.214864 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422
ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.236456 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.253754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc 
kubenswrapper[4795]: I0310 15:08:02.255253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.255334 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.255356 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.255380 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.255400 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:02Z","lastTransitionTime":"2026-03-10T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.270385 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.290295 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.310908 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.330992 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.350563 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.359480 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.359550 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.359574 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.359606 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.359627 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:02Z","lastTransitionTime":"2026-03-10T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.367282 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.384558 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.407213 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.427440 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.444506 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.462446 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.463238 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.463308 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.463332 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.463362 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.463388 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:02Z","lastTransitionTime":"2026-03-10T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.484596 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.506950 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.527135 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.547050 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.566882 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.566949 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.566971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.567005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.567028 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:02Z","lastTransitionTime":"2026-03-10T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.571786 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.591441 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.621653 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"services.LB{}\\\\nF0310 15:08:00.077347 6834 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z]\\\\nI0310 15:08:00.077348 6834 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0310 15:08:00.077360 6834 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default 
has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0310 15:08:00.077351 6834 services_controller.go:451] Built service openshift-consol\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"
},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc9
57adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.655450 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.669914 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.669975 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.669992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.670019 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.670036 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:02Z","lastTransitionTime":"2026-03-10T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.674752 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.697439 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4c
e2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.714786 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.728847 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.750472 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.767775 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:02Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:02 crc 
kubenswrapper[4795]: I0310 15:08:02.773885 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.773964 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.773992 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.774023 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.774045 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:02Z","lastTransitionTime":"2026-03-10T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.877483 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.877544 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.877554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.877571 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.877584 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:02Z","lastTransitionTime":"2026-03-10T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.950000 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:02 crc kubenswrapper[4795]: E0310 15:08:02.950195 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:02 crc kubenswrapper[4795]: E0310 15:08:02.950270 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs podName:3036349b-f184-48aa-b5ab-de9c5c7ae511 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:04.950254811 +0000 UTC m=+118.115995709 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs") pod "network-metrics-daemon-zjg2f" (UID: "3036349b-f184-48aa-b5ab-de9c5c7ae511") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.980822 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.980868 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.980878 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.980894 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:02 crc kubenswrapper[4795]: I0310 15:08:02.980905 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:02Z","lastTransitionTime":"2026-03-10T15:08:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.083998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.084038 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.084050 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.084094 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.084110 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:03Z","lastTransitionTime":"2026-03-10T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.187641 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.187724 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.187748 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.187777 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.187800 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:03Z","lastTransitionTime":"2026-03-10T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.290999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.291059 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.291107 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.291130 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.291149 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:03Z","lastTransitionTime":"2026-03-10T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.394883 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.394972 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.394999 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.395144 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.395200 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:03Z","lastTransitionTime":"2026-03-10T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.476587 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:03 crc kubenswrapper[4795]: E0310 15:08:03.476772 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.476898 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.477099 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:03 crc kubenswrapper[4795]: E0310 15:08:03.477113 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.477190 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:03 crc kubenswrapper[4795]: E0310 15:08:03.477388 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:03 crc kubenswrapper[4795]: E0310 15:08:03.477563 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.532961 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.533111 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.533132 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.533155 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.533175 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:03Z","lastTransitionTime":"2026-03-10T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.636770 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.636833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.636853 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.636879 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.636897 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:03Z","lastTransitionTime":"2026-03-10T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.739707 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.739756 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.739772 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.739794 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.739811 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:03Z","lastTransitionTime":"2026-03-10T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.842778 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.842856 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.842880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.842910 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.842933 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:03Z","lastTransitionTime":"2026-03-10T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.946032 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.946151 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.946176 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.946305 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:03 crc kubenswrapper[4795]: I0310 15:08:03.946328 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:03Z","lastTransitionTime":"2026-03-10T15:08:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.050175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.050235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.050262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.050292 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.050316 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:04Z","lastTransitionTime":"2026-03-10T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.152704 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.152776 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.152799 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.152823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.152840 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:04Z","lastTransitionTime":"2026-03-10T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.256439 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.256523 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.256561 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.256598 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.256622 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:04Z","lastTransitionTime":"2026-03-10T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.359348 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.359414 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.359439 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.359469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.359506 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:04Z","lastTransitionTime":"2026-03-10T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.462607 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.462677 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.462699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.462728 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.462751 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:04Z","lastTransitionTime":"2026-03-10T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.565608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.565718 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.565747 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.565782 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.565805 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:04Z","lastTransitionTime":"2026-03-10T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.670514 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.670602 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.670628 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.670662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.670685 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:04Z","lastTransitionTime":"2026-03-10T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.773200 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.773254 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.773277 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.773306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.773323 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:04Z","lastTransitionTime":"2026-03-10T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.877565 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.877648 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.877668 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.877699 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.877724 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:04Z","lastTransitionTime":"2026-03-10T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.971740 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:04 crc kubenswrapper[4795]: E0310 15:08:04.971979 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:04 crc kubenswrapper[4795]: E0310 15:08:04.972052 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs podName:3036349b-f184-48aa-b5ab-de9c5c7ae511 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:08.972028779 +0000 UTC m=+122.137769707 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs") pod "network-metrics-daemon-zjg2f" (UID: "3036349b-f184-48aa-b5ab-de9c5c7ae511") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.980746 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.980801 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.980823 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.980851 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:04 crc kubenswrapper[4795]: I0310 15:08:04.980872 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:04Z","lastTransitionTime":"2026-03-10T15:08:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.084234 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.084286 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.084302 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.084325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.084342 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.186976 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.187058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.187115 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.187152 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.187173 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.289713 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.289789 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.289807 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.289833 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.289851 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.392971 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.393034 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.393051 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.393101 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.393121 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.475504 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.475601 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.475663 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.475550 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:05 crc kubenswrapper[4795]: E0310 15:08:05.475907 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:05 crc kubenswrapper[4795]: E0310 15:08:05.476050 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:05 crc kubenswrapper[4795]: E0310 15:08:05.476216 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:05 crc kubenswrapper[4795]: E0310 15:08:05.476284 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.496014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.496099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.496117 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.496146 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.496164 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.605257 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.605353 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.605378 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.605413 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.605437 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.708880 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.708947 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.708965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.708994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.709016 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.812395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.812837 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.812857 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.812884 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.812903 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.874306 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.874357 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.874368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.874390 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.874402 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: E0310 15:08:05.891875 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.896368 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.896450 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.896475 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.896498 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.896516 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: E0310 15:08:05.917385 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.921463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.921530 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.921554 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.921585 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.921608 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: E0310 15:08:05.943234 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.949495 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.949547 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.949561 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.949581 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.949593 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: E0310 15:08:05.970193 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.975121 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.975161 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.975173 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.975190 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.975201 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:05 crc kubenswrapper[4795]: E0310 15:08:05.995707 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:05Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:05 crc kubenswrapper[4795]: E0310 15:08:05.995859 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.998392 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.998437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.998453 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.998476 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:05 crc kubenswrapper[4795]: I0310 15:08:05.998494 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:05Z","lastTransitionTime":"2026-03-10T15:08:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.102579 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.102644 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.102662 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.102688 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.102706 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:06Z","lastTransitionTime":"2026-03-10T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.206168 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.206236 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.206259 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.206285 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.206304 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:06Z","lastTransitionTime":"2026-03-10T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.309905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.309977 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.309994 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.310024 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.310043 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:06Z","lastTransitionTime":"2026-03-10T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.412850 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.412918 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.412938 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.412965 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.412986 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:06Z","lastTransitionTime":"2026-03-10T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.516348 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.516405 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.516422 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.516446 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.516464 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:06Z","lastTransitionTime":"2026-03-10T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.619676 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.619741 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.619760 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.619783 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.619801 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:06Z","lastTransitionTime":"2026-03-10T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.723127 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.723207 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.723230 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.723256 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.723272 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:06Z","lastTransitionTime":"2026-03-10T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.826921 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.826995 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.827018 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.827045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.827104 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:06Z","lastTransitionTime":"2026-03-10T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.930886 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.930945 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.930962 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.930987 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:06 crc kubenswrapper[4795]: I0310 15:08:06.931004 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:06Z","lastTransitionTime":"2026-03-10T15:08:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.033798 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.033899 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.033920 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.033944 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.033960 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:07Z","lastTransitionTime":"2026-03-10T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.137832 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.137901 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.137925 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.137952 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.137973 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:07Z","lastTransitionTime":"2026-03-10T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.242395 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.242463 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.242596 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.242692 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.242713 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:07Z","lastTransitionTime":"2026-03-10T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.346126 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.346209 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.346235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.346266 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.346292 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:07Z","lastTransitionTime":"2026-03-10T15:08:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:07 crc kubenswrapper[4795]: E0310 15:08:07.447454 4795 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.476431 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.476508 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.476575 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:07 crc kubenswrapper[4795]: E0310 15:08:07.476714 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.476732 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:07 crc kubenswrapper[4795]: E0310 15:08:07.477021 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:07 crc kubenswrapper[4795]: E0310 15:08:07.477178 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:07 crc kubenswrapper[4795]: E0310 15:08:07.477354 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.495808 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08
aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.519850 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.538768 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd196
39797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\
\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: E0310 15:08:07.573722 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.574927 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.597355 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.611362 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc 
kubenswrapper[4795]: I0310 15:08:07.626637 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.646325 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.665534 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.683153 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.704808 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.717757 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.730433 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.747992 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, 
/tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.768346 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.791477 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"services.LB{}\\\\nF0310 15:08:00.077347 6834 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z]\\\\nI0310 15:08:00.077348 6834 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0310 15:08:00.077360 6834 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default 
has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0310 15:08:00.077351 6834 services_controller.go:451] Built service openshift-consol\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"
},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc9
57adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:07 crc kubenswrapper[4795]: I0310 15:08:07.813048 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:07Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:09 crc kubenswrapper[4795]: I0310 15:08:09.018767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:09 crc kubenswrapper[4795]: E0310 15:08:09.018991 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:09 crc kubenswrapper[4795]: E0310 15:08:09.019118 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs podName:3036349b-f184-48aa-b5ab-de9c5c7ae511 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:17.019093512 +0000 UTC m=+130.184834450 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs") pod "network-metrics-daemon-zjg2f" (UID: "3036349b-f184-48aa-b5ab-de9c5c7ae511") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:09 crc kubenswrapper[4795]: I0310 15:08:09.476739 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:09 crc kubenswrapper[4795]: I0310 15:08:09.476731 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:09 crc kubenswrapper[4795]: I0310 15:08:09.476851 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:09 crc kubenswrapper[4795]: I0310 15:08:09.476939 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:09 crc kubenswrapper[4795]: E0310 15:08:09.477144 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:09 crc kubenswrapper[4795]: E0310 15:08:09.477360 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:09 crc kubenswrapper[4795]: E0310 15:08:09.477460 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:09 crc kubenswrapper[4795]: E0310 15:08:09.477546 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.108999 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.130699 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.147611 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.163033 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.186710 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\"
,\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.205141 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.227290 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.244419 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.272161 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.288214 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.312906 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"services.LB{}\\\\nF0310 15:08:00.077347 6834 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z]\\\\nI0310 15:08:00.077348 6834 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0310 15:08:00.077360 6834 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default 
has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0310 15:08:00.077351 6834 services_controller.go:451] Built service openshift-consol\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"
},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc9
57adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.330560 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.341528 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.359389 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.371784 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.382934 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.396471 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.408004 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:11Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:11 crc 
kubenswrapper[4795]: I0310 15:08:11.475701 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:11 crc kubenswrapper[4795]: E0310 15:08:11.475857 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.475972 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.476131 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:11 crc kubenswrapper[4795]: I0310 15:08:11.476183 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:11 crc kubenswrapper[4795]: E0310 15:08:11.476366 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:11 crc kubenswrapper[4795]: E0310 15:08:11.476594 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:11 crc kubenswrapper[4795]: E0310 15:08:11.476716 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:12 crc kubenswrapper[4795]: E0310 15:08:12.575376 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:13 crc kubenswrapper[4795]: I0310 15:08:13.476348 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:13 crc kubenswrapper[4795]: E0310 15:08:13.476533 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:13 crc kubenswrapper[4795]: I0310 15:08:13.476552 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:13 crc kubenswrapper[4795]: I0310 15:08:13.476587 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:13 crc kubenswrapper[4795]: E0310 15:08:13.476745 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:13 crc kubenswrapper[4795]: I0310 15:08:13.476802 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:13 crc kubenswrapper[4795]: E0310 15:08:13.476987 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:13 crc kubenswrapper[4795]: E0310 15:08:13.477131 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:15 crc kubenswrapper[4795]: I0310 15:08:15.475617 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:15 crc kubenswrapper[4795]: I0310 15:08:15.475688 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:15 crc kubenswrapper[4795]: I0310 15:08:15.475804 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:15 crc kubenswrapper[4795]: I0310 15:08:15.475953 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:15 crc kubenswrapper[4795]: E0310 15:08:15.475992 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:15 crc kubenswrapper[4795]: E0310 15:08:15.476348 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:15 crc kubenswrapper[4795]: E0310 15:08:15.478857 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:15 crc kubenswrapper[4795]: E0310 15:08:15.479126 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:15 crc kubenswrapper[4795]: I0310 15:08:15.484524 4795 scope.go:117] "RemoveContainer" containerID="5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730" Mar 10 15:08:15 crc kubenswrapper[4795]: I0310 15:08:15.492999 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.104417 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/1.log" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.108295 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.109311 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849"} Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.110281 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.129864 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.147939 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"m
ountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.161036 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc 
kubenswrapper[4795]: I0310 15:08:16.178596 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.196769 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.210465 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.223918 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.245580 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\"
,\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.259426 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.259494 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.259513 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.259541 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.259560 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:16Z","lastTransitionTime":"2026-03-10T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:16 crc kubenswrapper[4795]: E0310 15:08:16.278009 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.282435 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.282489 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.282504 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.282526 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.282541 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:16Z","lastTransitionTime":"2026-03-10T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.284091 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483e
cea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: E0310 15:08:16.297650 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.302010 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.302166 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.302199 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.302210 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.302226 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.302237 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:16Z","lastTransitionTime":"2026-03-10T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:16 crc kubenswrapper[4795]: E0310 15:08:16.318794 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.322148 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.322989 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.323029 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.323042 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.323058 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.323081 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:16Z","lastTransitionTime":"2026-03-10T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:16 crc kubenswrapper[4795]: E0310 15:08:16.336167 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.339257 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\
\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.340136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.340169 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.340179 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.340196 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.340208 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:16Z","lastTransitionTime":"2026-03-10T15:08:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.351034 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: E0310 15:08:16.352797 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: E0310 15:08:16.353039 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.367945 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"services.LB{}\\\\nF0310 15:08:00.077347 6834 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z]\\\\nI0310 15:08:00.077348 6834 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0310 15:08:00.077360 6834 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default 
has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0310 15:08:00.077351 6834 services_controller.go:451] Built service openshift-consol\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.388008 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.398956 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.416654 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:16 crc kubenswrapper[4795]: I0310 15:08:16.428935 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:16Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.114889 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:17 crc kubenswrapper[4795]: E0310 15:08:17.115191 
4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:17 crc kubenswrapper[4795]: E0310 15:08:17.115328 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs podName:3036349b-f184-48aa-b5ab-de9c5c7ae511 nodeName:}" failed. No retries permitted until 2026-03-10 15:08:33.115292466 +0000 UTC m=+146.281033404 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs") pod "network-metrics-daemon-zjg2f" (UID: "3036349b-f184-48aa-b5ab-de9c5c7ae511") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.115931 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/2.log" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.117662 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/1.log" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.121919 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.123214 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849" exitCode=1 Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.123288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" 
event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849"} Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.123347 4795 scope.go:117] "RemoveContainer" containerID="5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.124620 4795 scope.go:117] "RemoveContainer" containerID="51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849" Mar 10 15:08:17 crc kubenswrapper[4795]: E0310 15:08:17.124904 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.146349 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.167340 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.198500 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"services.LB{}\\\\nF0310 15:08:00.077347 6834 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z]\\\\nI0310 15:08:00.077348 6834 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0310 15:08:00.077360 6834 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default 
has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0310 15:08:00.077351 6834 services_controller.go:451] Built service openshift-consol\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"led attempt(s)\\\\nI0310 15:08:16.538104 7073 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 15:08:16.538014 7073 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0310 15:08:16.537883 7073 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk in node crc\\\\nI0310 15:08:16.538132 7073 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.433726ms\\\\nI0310 15:08:16.538135 7073 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk after 0 failed attempt(s)\\\\nI0310 15:08:16.537918 7073 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jc8ps\\\\nI0310 15:08:16.538147 7073 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4q8gk\\\\nI0310 15:08:16.537898 7073 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk\\\\nI0310 15:08:16.537793 7073 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-multus/multus-additional-cni-plugins-tn44z\\\\nI0310 15:08:16.537922 7073 obj_retry.go:386]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.229846 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.248824 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.271564 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.288757 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.304846 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.326752 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.344043 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc 
kubenswrapper[4795]: I0310 15:08:17.363176 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.377774 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.392523 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.414471 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\"
,\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.433329 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.452266 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.475937 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.475981 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.476009 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.475900 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:17 crc kubenswrapper[4795]: E0310 15:08:17.476199 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:17 crc kubenswrapper[4795]: E0310 15:08:17.476372 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:17 crc kubenswrapper[4795]: E0310 15:08:17.476505 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:17 crc kubenswrapper[4795]: E0310 15:08:17.476677 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.505475 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.531616 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.544410 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.558218 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22
fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: E0310 15:08:17.575790 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.583532 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.596576 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.609735 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.625922 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.640705 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.661125 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.683439 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.705866 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.728144 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ab0ed8cd15c0a33276f1df08996b38724146248122fcdfcb55bce451641e730\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"message\\\":\\\"services.LB{}\\\\nF0310 15:08:00.077347 6834 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:00Z is after 2025-08-24T17:21:41Z]\\\\nI0310 15:08:00.077348 6834 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0310 15:08:00.077360 6834 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default 
has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0310 15:08:00.077351 6834 services_controller.go:451] Built service openshift-consol\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"led attempt(s)\\\\nI0310 15:08:16.538104 7073 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 15:08:16.538014 7073 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0310 15:08:16.537883 7073 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk in node crc\\\\nI0310 15:08:16.538132 7073 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.433726ms\\\\nI0310 15:08:16.538135 7073 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk after 0 failed attempt(s)\\\\nI0310 15:08:16.537918 7073 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jc8ps\\\\nI0310 15:08:16.538147 7073 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4q8gk\\\\nI0310 15:08:16.537898 7073 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk\\\\nI0310 15:08:16.537793 7073 obj_retry.go:365] Adding new object: *v1.Pod 
openshift-multus/multus-additional-cni-plugins-tn44z\\\\nI0310 15:08:16.537922 7073 obj_retry.go:386]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.754672 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.771789 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.791125 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.809573 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.822596 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.840476 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:17 crc kubenswrapper[4795]: I0310 15:08:17.857217 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:17Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc 
kubenswrapper[4795]: I0310 15:08:18.131933 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/2.log" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.136704 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.139433 4795 scope.go:117] "RemoveContainer" containerID="51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849" Mar 10 15:08:18 crc kubenswrapper[4795]: E0310 15:08:18.139771 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.156515 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.172815 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.190025 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc 
kubenswrapper[4795]: I0310 15:08:18.209936 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.227624 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.245852 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.262574 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.286976 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\"
,\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.306185 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.326932 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.348297 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.369227 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.389945 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.422417 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"led attempt(s)\\\\nI0310 15:08:16.538104 7073 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 15:08:16.538014 7073 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0310 15:08:16.537883 7073 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk in node crc\\\\nI0310 15:08:16.538132 7073 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.433726ms\\\\nI0310 15:08:16.538135 7073 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk after 0 failed attempt(s)\\\\nI0310 15:08:16.537918 7073 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jc8ps\\\\nI0310 15:08:16.538147 7073 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4q8gk\\\\nI0310 15:08:16.537898 7073 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk\\\\nI0310 15:08:16.537793 7073 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-tn44z\\\\nI0310 15:08:16.537922 7073 obj_retry.go:386]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",
\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111
e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.456518 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.475753 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.501597 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:18 crc kubenswrapper[4795]: I0310 15:08:18.521010 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:18Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:19 crc kubenswrapper[4795]: I0310 15:08:19.475871 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:19 crc kubenswrapper[4795]: I0310 15:08:19.476611 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:19 crc kubenswrapper[4795]: I0310 15:08:19.476623 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:19 crc kubenswrapper[4795]: I0310 15:08:19.476750 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:19 crc kubenswrapper[4795]: E0310 15:08:19.476767 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:19 crc kubenswrapper[4795]: E0310 15:08:19.476924 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:19 crc kubenswrapper[4795]: E0310 15:08:19.477302 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:19 crc kubenswrapper[4795]: E0310 15:08:19.477430 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:21 crc kubenswrapper[4795]: I0310 15:08:21.476374 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:21 crc kubenswrapper[4795]: I0310 15:08:21.476476 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:21 crc kubenswrapper[4795]: I0310 15:08:21.476529 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:21 crc kubenswrapper[4795]: I0310 15:08:21.476688 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:21 crc kubenswrapper[4795]: E0310 15:08:21.476680 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:21 crc kubenswrapper[4795]: E0310 15:08:21.476821 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:21 crc kubenswrapper[4795]: E0310 15:08:21.477029 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:21 crc kubenswrapper[4795]: E0310 15:08:21.477135 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:22 crc kubenswrapper[4795]: E0310 15:08:22.577782 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:23 crc kubenswrapper[4795]: I0310 15:08:23.475898 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:23 crc kubenswrapper[4795]: E0310 15:08:23.476355 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:23 crc kubenswrapper[4795]: I0310 15:08:23.476422 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:23 crc kubenswrapper[4795]: E0310 15:08:23.476738 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:23 crc kubenswrapper[4795]: I0310 15:08:23.476522 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:23 crc kubenswrapper[4795]: I0310 15:08:23.476473 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:23 crc kubenswrapper[4795]: E0310 15:08:23.477318 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:23 crc kubenswrapper[4795]: E0310 15:08:23.477200 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:23 crc kubenswrapper[4795]: I0310 15:08:23.492410 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 10 15:08:25 crc kubenswrapper[4795]: I0310 15:08:25.475862 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:25 crc kubenswrapper[4795]: E0310 15:08:25.476107 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:25 crc kubenswrapper[4795]: I0310 15:08:25.476499 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:25 crc kubenswrapper[4795]: E0310 15:08:25.476625 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:25 crc kubenswrapper[4795]: I0310 15:08:25.476920 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:25 crc kubenswrapper[4795]: E0310 15:08:25.477123 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:25 crc kubenswrapper[4795]: I0310 15:08:25.477554 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:25 crc kubenswrapper[4795]: E0310 15:08:25.477700 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.479609 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.479687 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.479709 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.479743 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.479767 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:26Z","lastTransitionTime":"2026-03-10T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:26 crc kubenswrapper[4795]: E0310 15:08:26.504571 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.510497 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.510546 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.510558 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.510575 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.510587 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:26Z","lastTransitionTime":"2026-03-10T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:26 crc kubenswrapper[4795]: E0310 15:08:26.527690 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.532690 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.532725 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.532736 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.532754 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.532766 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:26Z","lastTransitionTime":"2026-03-10T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:26 crc kubenswrapper[4795]: E0310 15:08:26.546880 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.552147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.552349 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.552608 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.552809 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.552958 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:26Z","lastTransitionTime":"2026-03-10T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:26 crc kubenswrapper[4795]: E0310 15:08:26.572928 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.578522 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.578752 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.578905 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.579099 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:26 crc kubenswrapper[4795]: I0310 15:08:26.579273 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:26Z","lastTransitionTime":"2026-03-10T15:08:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:26 crc kubenswrapper[4795]: E0310 15:08:26.595209 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:26Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:26 crc kubenswrapper[4795]: E0310 15:08:26.595457 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.476448 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.476503 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:27 crc kubenswrapper[4795]: E0310 15:08:27.476706 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:27 crc kubenswrapper[4795]: E0310 15:08:27.476894 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.477538 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:27 crc kubenswrapper[4795]: E0310 15:08:27.478267 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.478495 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:27 crc kubenswrapper[4795]: E0310 15:08:27.478856 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.492236 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.511303 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77f99bfb-549e-4e7e-9d7a-baf44c301a0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e0d5ea865b0805fae58dc7df068aadebac4d089968699012a9979b4144410aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:06:37.969814 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:06:37.971387 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:06:37.972633 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:06:37.976050 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 15:07:06.308828 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 15:07:07.495570 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 15:07:07.495755 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34142cc84eeec1486ccc227175b073882d898b9b6376d5fbb9c019fbe8ef1be3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721fcc3f29c16b316e4076833a39ab9b8ae4021210ed96b59b940498b9940bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.532125 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.547527 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc 
kubenswrapper[4795]: I0310 15:08:27.560219 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k
6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: E0310 15:08:27.578562 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.582201 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.600858 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.619158 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.638747 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.655878 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.676585 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.691882 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.707656 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.725899 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.746091 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"led attempt(s)\\\\nI0310 15:08:16.538104 7073 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 15:08:16.538014 7073 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0310 15:08:16.537883 7073 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk in node crc\\\\nI0310 15:08:16.538132 7073 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.433726ms\\\\nI0310 15:08:16.538135 7073 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk after 0 failed attempt(s)\\\\nI0310 15:08:16.537918 7073 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jc8ps\\\\nI0310 15:08:16.538147 7073 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4q8gk\\\\nI0310 15:08:16.537898 7073 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk\\\\nI0310 15:08:16.537793 7073 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-tn44z\\\\nI0310 15:08:16.537922 7073 obj_retry.go:386]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",
\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111
e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.770746 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.786715 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.811267 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:27 crc kubenswrapper[4795]: I0310 15:08:27.831413 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:27Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:29 crc kubenswrapper[4795]: I0310 15:08:29.476627 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:29 crc kubenswrapper[4795]: I0310 15:08:29.476680 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:29 crc kubenswrapper[4795]: I0310 15:08:29.476654 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:29 crc kubenswrapper[4795]: I0310 15:08:29.476810 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:29 crc kubenswrapper[4795]: E0310 15:08:29.476991 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:29 crc kubenswrapper[4795]: E0310 15:08:29.477106 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:29 crc kubenswrapper[4795]: E0310 15:08:29.477184 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:29 crc kubenswrapper[4795]: E0310 15:08:29.478048 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:31 crc kubenswrapper[4795]: I0310 15:08:31.472503 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:31 crc kubenswrapper[4795]: I0310 15:08:31.472585 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.472753 4795 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.472752 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:08:31 crc kubenswrapper[4795]: 
E0310 15:08:31.472819 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.472840 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.472849 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:35.472822012 +0000 UTC m=+208.638562940 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.472935 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:35.472907474 +0000 UTC m=+208.638648412 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:08:31 crc kubenswrapper[4795]: I0310 15:08:31.476497 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:31 crc kubenswrapper[4795]: I0310 15:08:31.476601 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:31 crc kubenswrapper[4795]: I0310 15:08:31.476601 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.476681 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.476794 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:31 crc kubenswrapper[4795]: I0310 15:08:31.476824 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.477112 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.477292 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:31 crc kubenswrapper[4795]: I0310 15:08:31.574250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:08:31 crc kubenswrapper[4795]: I0310 15:08:31.574464 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.574899 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:35.574859579 +0000 UTC m=+208.740600507 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:08:31 crc kubenswrapper[4795]: I0310 15:08:31.574956 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.575186 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.575258 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.575270 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.575290 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:08:31 crc 
kubenswrapper[4795]: E0310 15:08:31.575395 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:35.575362754 +0000 UTC m=+208.741103692 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:08:31 crc kubenswrapper[4795]: E0310 15:08:31.575437 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:35.575420755 +0000 UTC m=+208.741161733 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:08:32 crc kubenswrapper[4795]: I0310 15:08:32.477471 4795 scope.go:117] "RemoveContainer" containerID="51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849" Mar 10 15:08:32 crc kubenswrapper[4795]: E0310 15:08:32.477922 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" Mar 10 15:08:32 crc kubenswrapper[4795]: E0310 15:08:32.579604 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:08:33 crc kubenswrapper[4795]: I0310 15:08:33.195487 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:33 crc kubenswrapper[4795]: E0310 15:08:33.195681 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:33 crc kubenswrapper[4795]: E0310 15:08:33.195810 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs podName:3036349b-f184-48aa-b5ab-de9c5c7ae511 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:05.195774273 +0000 UTC m=+178.361515211 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs") pod "network-metrics-daemon-zjg2f" (UID: "3036349b-f184-48aa-b5ab-de9c5c7ae511") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:08:33 crc kubenswrapper[4795]: I0310 15:08:33.475923 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:33 crc kubenswrapper[4795]: I0310 15:08:33.476014 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:33 crc kubenswrapper[4795]: I0310 15:08:33.476020 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:33 crc kubenswrapper[4795]: I0310 15:08:33.475960 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:33 crc kubenswrapper[4795]: E0310 15:08:33.476233 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:33 crc kubenswrapper[4795]: E0310 15:08:33.476313 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:33 crc kubenswrapper[4795]: E0310 15:08:33.476477 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:33 crc kubenswrapper[4795]: E0310 15:08:33.476584 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:35 crc kubenswrapper[4795]: I0310 15:08:35.476571 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:35 crc kubenswrapper[4795]: I0310 15:08:35.476602 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:35 crc kubenswrapper[4795]: I0310 15:08:35.476579 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:35 crc kubenswrapper[4795]: I0310 15:08:35.476567 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:35 crc kubenswrapper[4795]: E0310 15:08:35.476721 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:35 crc kubenswrapper[4795]: E0310 15:08:35.476959 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:35 crc kubenswrapper[4795]: E0310 15:08:35.477039 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:35 crc kubenswrapper[4795]: E0310 15:08:35.477162 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.206414 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v49r8_589b366f-9132-43cc-8d7a-d401d396bf06/kube-multus/0.log" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.206516 4795 generic.go:334] "Generic (PLEG): container finished" podID="589b366f-9132-43cc-8d7a-d401d396bf06" containerID="35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6" exitCode=1 Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.206580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v49r8" event={"ID":"589b366f-9132-43cc-8d7a-d401d396bf06","Type":"ContainerDied","Data":"35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6"} Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.207305 4795 scope.go:117] "RemoveContainer" 
containerID="35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.222448 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"h
ostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.241638 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77f99bfb-549e-4e7e-9d7a-baf44c301a0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e0d5ea865b0805fae58dc7df068aadebac4d089968699012a9979b4144410aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:06:37.969814 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:06:37.971387 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:06:37.972633 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:06:37.976050 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 15:07:06.308828 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 15:07:07.495570 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 15:07:07.495755 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34142cc84eeec1486ccc227175b073882d898b9b6376d5fbb9c019fbe8ef1be3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721fcc3f29c16b316e4076833a39ab9b8ae4021210ed96b59b940498b9940bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.264592 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:35Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:49+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af\\\\n2026-03-10T15:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af to /host/opt/cni/bin/\\\\n2026-03-10T15:07:50Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:50Z [verbose] Readiness Indicator file check\\\\n2026-03-10T15:08:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.281759 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.297660 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T
15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.313411 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.334479 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.352717 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.369399 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.384239 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.397665 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.410922 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.425185 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.447126 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.479563 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"led attempt(s)\\\\nI0310 15:08:16.538104 7073 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 15:08:16.538014 7073 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0310 15:08:16.537883 7073 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk in node crc\\\\nI0310 15:08:16.538132 7073 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.433726ms\\\\nI0310 15:08:16.538135 7073 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk after 0 failed attempt(s)\\\\nI0310 15:08:16.537918 7073 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jc8ps\\\\nI0310 15:08:16.538147 7073 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4q8gk\\\\nI0310 15:08:16.537898 7073 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk\\\\nI0310 15:08:16.537793 7073 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-tn44z\\\\nI0310 15:08:16.537922 7073 obj_retry.go:386]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",
\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111
e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.513842 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.531951 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.557297 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.575339 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.979810 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.979864 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.979877 4795 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.979897 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.979913 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:36Z","lastTransitionTime":"2026-03-10T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:36 crc kubenswrapper[4795]: E0310 15:08:36.994386 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:36Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.999206 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.999253 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.999271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.999291 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:36 crc kubenswrapper[4795]: I0310 15:08:36.999307 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:36Z","lastTransitionTime":"2026-03-10T15:08:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:37 crc kubenswrapper[4795]: E0310 15:08:37.024977 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.030325 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.030418 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.030437 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.030461 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.030479 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:37Z","lastTransitionTime":"2026-03-10T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:37 crc kubenswrapper[4795]: E0310 15:08:37.051807 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.057143 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.057211 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.057231 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.057262 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.057287 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:37Z","lastTransitionTime":"2026-03-10T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.084045 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.084128 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.084147 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.084175 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.084198 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:37Z","lastTransitionTime":"2026-03-10T15:08:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 10 15:08:37 crc kubenswrapper[4795]: E0310 15:08:37.099308 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: E0310 15:08:37.099555 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.214131 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v49r8_589b366f-9132-43cc-8d7a-d401d396bf06/kube-multus/0.log" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.214226 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v49r8" event={"ID":"589b366f-9132-43cc-8d7a-d401d396bf06","Type":"ContainerStarted","Data":"e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6"} Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.233871 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.272706 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.290833 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.316211 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.331465 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.349611 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.370481 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77f99bfb-549e-4e7e-9d7a-baf44c301a0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e0d5ea865b0805fae58dc7df068aadebac4d089968699012a9979b4144410aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:06:37.969814 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:06:37.971387 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:06:37.972633 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:06:37.976050 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 15:07:06.308828 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 15:07:07.495570 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 15:07:07.495755 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34142cc84eeec1486ccc227175b073882d898b9b6376d5fbb9c019fbe8ef1be3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721fcc3f29c16b316e4076833a39ab9b8ae4021210ed96b59b940498b9940bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.390253 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:35Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af\\\\n2026-03-10T15:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af to /host/opt/cni/bin/\\\\n2026-03-10T15:07:50Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:50Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T15:08:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.409100 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.427538 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.443946 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.460259 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.476323 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.476348 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.476375 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.476474 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:37 crc kubenswrapper[4795]: E0310 15:08:37.476639 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:37 crc kubenswrapper[4795]: E0310 15:08:37.476961 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:37 crc kubenswrapper[4795]: E0310 15:08:37.477277 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.477376 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: E0310 15:08:37.477429 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.504690 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\"
,\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.523036 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.541621 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.562823 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: E0310 15:08:37.580522 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.584205 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.615719 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"led attempt(s)\\\\nI0310 15:08:16.538104 7073 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 15:08:16.538014 7073 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0310 15:08:16.537883 7073 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk in node crc\\\\nI0310 15:08:16.538132 7073 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.433726ms\\\\nI0310 15:08:16.538135 7073 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk after 0 failed attempt(s)\\\\nI0310 15:08:16.537918 7073 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jc8ps\\\\nI0310 15:08:16.538147 7073 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4q8gk\\\\nI0310 15:08:16.537898 7073 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk\\\\nI0310 15:08:16.537793 7073 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-tn44z\\\\nI0310 15:08:16.537922 7073 obj_retry.go:386]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",
\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111
e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.635965 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\"
,\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.655639 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.677835 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.693342 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.708524 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.727695 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.742734 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.758283 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.780426 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.804351 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.838707 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"led attempt(s)\\\\nI0310 15:08:16.538104 7073 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 15:08:16.538014 7073 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0310 15:08:16.537883 7073 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk in node crc\\\\nI0310 15:08:16.538132 7073 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.433726ms\\\\nI0310 15:08:16.538135 7073 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk after 0 failed attempt(s)\\\\nI0310 15:08:16.537918 7073 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jc8ps\\\\nI0310 15:08:16.538147 7073 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4q8gk\\\\nI0310 15:08:16.537898 7073 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk\\\\nI0310 15:08:16.537793 7073 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-tn44z\\\\nI0310 15:08:16.537922 7073 obj_retry.go:386]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",
\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111
e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.873506 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.889199 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.911350 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.929416 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.942532 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.959128 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77f99bfb-549e-4e7e-9d7a-baf44c301a0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e0d5ea865b0805fae58dc7df068aadebac4d089968699012a9979b4144410aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:06:37.969814 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:06:37.971387 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:06:37.972633 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:06:37.976050 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 15:07:06.308828 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 15:07:07.495570 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 15:07:07.495755 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34142cc84eeec1486ccc227175b073882d898b9b6376d5fbb9c019fbe8ef1be3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721fcc3f29c16b316e4076833a39ab9b8ae4021210ed96b59b940498b9940bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.977735 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:35Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af\\\\n2026-03-10T15:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af to /host/opt/cni/bin/\\\\n2026-03-10T15:07:50Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:50Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T15:08:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:37 crc kubenswrapper[4795]: I0310 15:08:37.992280 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:37Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:39 crc kubenswrapper[4795]: I0310 15:08:39.476405 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:39 crc kubenswrapper[4795]: I0310 15:08:39.476419 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:39 crc kubenswrapper[4795]: E0310 15:08:39.477423 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:39 crc kubenswrapper[4795]: I0310 15:08:39.476542 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:39 crc kubenswrapper[4795]: I0310 15:08:39.476460 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:39 crc kubenswrapper[4795]: E0310 15:08:39.477602 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:39 crc kubenswrapper[4795]: E0310 15:08:39.477720 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:39 crc kubenswrapper[4795]: E0310 15:08:39.477793 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:41 crc kubenswrapper[4795]: I0310 15:08:41.476446 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:41 crc kubenswrapper[4795]: I0310 15:08:41.476552 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:41 crc kubenswrapper[4795]: I0310 15:08:41.476583 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:41 crc kubenswrapper[4795]: I0310 15:08:41.476605 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:41 crc kubenswrapper[4795]: E0310 15:08:41.478158 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:41 crc kubenswrapper[4795]: E0310 15:08:41.480562 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:41 crc kubenswrapper[4795]: E0310 15:08:41.480837 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:41 crc kubenswrapper[4795]: E0310 15:08:41.481054 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:42 crc kubenswrapper[4795]: E0310 15:08:42.581611 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:43 crc kubenswrapper[4795]: I0310 15:08:43.476568 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:43 crc kubenswrapper[4795]: I0310 15:08:43.476657 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:43 crc kubenswrapper[4795]: I0310 15:08:43.476740 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:43 crc kubenswrapper[4795]: E0310 15:08:43.476744 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:43 crc kubenswrapper[4795]: I0310 15:08:43.476816 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:43 crc kubenswrapper[4795]: E0310 15:08:43.477052 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:43 crc kubenswrapper[4795]: E0310 15:08:43.477177 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:43 crc kubenswrapper[4795]: E0310 15:08:43.477328 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:45 crc kubenswrapper[4795]: I0310 15:08:45.475777 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:45 crc kubenswrapper[4795]: I0310 15:08:45.475845 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:45 crc kubenswrapper[4795]: I0310 15:08:45.475982 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:45 crc kubenswrapper[4795]: E0310 15:08:45.475985 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:45 crc kubenswrapper[4795]: I0310 15:08:45.476037 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:45 crc kubenswrapper[4795]: E0310 15:08:45.476255 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:45 crc kubenswrapper[4795]: E0310 15:08:45.476399 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:45 crc kubenswrapper[4795]: E0310 15:08:45.476528 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:45 crc kubenswrapper[4795]: I0310 15:08:45.477809 4795 scope.go:117] "RemoveContainer" containerID="51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.248819 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/2.log" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.252603 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.253569 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08"} Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.254149 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.282449 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.296789 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.323114 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"led attempt(s)\\\\nI0310 15:08:16.538104 7073 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 15:08:16.538014 7073 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0310 15:08:16.537883 7073 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk in node crc\\\\nI0310 15:08:16.538132 7073 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.433726ms\\\\nI0310 15:08:16.538135 7073 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk after 0 failed attempt(s)\\\\nI0310 15:08:16.537918 7073 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jc8ps\\\\nI0310 15:08:16.538147 7073 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4q8gk\\\\nI0310 15:08:16.537898 7073 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk\\\\nI0310 15:08:16.537793 7073 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-tn44z\\\\nI0310 15:08:16.537922 7073 obj_retry.go:386]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\
\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.345510 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.355337 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.367968 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.377454 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.386754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.399281 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77f99bfb-549e-4e7e-9d7a-baf44c301a0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e0d5ea865b0805fae58dc7df068aadebac4d089968699012a9979b4144410aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:06:37.969814 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:06:37.971387 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:06:37.972633 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:06:37.976050 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 15:07:06.308828 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 15:07:07.495570 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 15:07:07.495755 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34142cc84eeec1486ccc227175b073882d898b9b6376d5fbb9c019fbe8ef1be3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721fcc3f29c16b316e4076833a39ab9b8ae4021210ed96b59b940498b9940bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.410576 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:35Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af\\\\n2026-03-10T15:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af to /host/opt/cni/bin/\\\\n2026-03-10T15:07:50Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:50Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T15:08:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.421696 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.437558 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.450903 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.468342 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.486804 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.503287 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.516867 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.528414 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:46 crc kubenswrapper[4795]: I0310 15:08:46.546563 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:46Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.260517 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/3.log" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.261901 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/2.log" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.266480 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.267860 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08" exitCode=1 Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.267937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08"} Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.267986 4795 scope.go:117] "RemoveContainer" containerID="51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.268966 4795 scope.go:117] "RemoveContainer" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.269282 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.288979 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.307161 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.324125 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.338230 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.338289 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.338307 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.338333 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.338350 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:47Z","lastTransitionTime":"2026-03-10T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.347496 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7
b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.359101 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.364181 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.364235 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.364247 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.364264 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.364276 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:47Z","lastTransitionTime":"2026-03-10T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.370242 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.385707 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.389646 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.389998 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.390056 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.390105 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.390136 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.390157 4795 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:47Z","lastTransitionTime":"2026-03-10T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.414859 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.418941 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.424271 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.424469 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.424658 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.424814 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.424953 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:47Z","lastTransitionTime":"2026-03-10T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.433866 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.444167 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.449014 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.449381 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.449519 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.449646 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.449808 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:47Z","lastTransitionTime":"2026-03-10T15:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.453447 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.472762 4795 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"2b8471d8-f6d6-4351-8edc-ecce171cc356\\\",\\\"systemUUID\\\":\\\"99de62c4-4c93-4a3b-bef3-57b8bbfec858\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.473033 4795 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.474406 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.475599 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.475637 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.475602 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.475749 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.475855 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.475977 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.476021 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.476571 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.499734 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"led attempt(s)\\\\nI0310 15:08:16.538104 7073 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 15:08:16.538014 7073 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0310 15:08:16.537883 7073 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk in node crc\\\\nI0310 15:08:16.538132 7073 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.433726ms\\\\nI0310 15:08:16.538135 7073 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk after 0 failed attempt(s)\\\\nI0310 15:08:16.537918 7073 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jc8ps\\\\nI0310 15:08:16.538147 7073 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4q8gk\\\\nI0310 15:08:16.537898 7073 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk\\\\nI0310 15:08:16.537793 7073 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-tn44z\\\\nI0310 15:08:16.537922 7073 obj_retry.go:386]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:46Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:08:46.555303 7397 services_controller.go:434] Service openshift-ingress/router-internal-default retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{router-internal-default openshift-ingress ed650b1e-939d-4166-88db-ddadc6d5accd 5426 0 2025-02-23 05:23:23 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingresscontroller.operator.openshift.io/owning-ingresscontroller:default] map[service.alpha.openshift.io/serving-cert-secret-name:router-metrics-certs-default service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{apps/v1 Deployment router-default bb5b8eff-af7d-47c8-85cd-465e0b29b7d1 0xc0077f4a9e \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:http,Protocol:TCP,Port:80,TargetPort:{1 0 
http},NodePort:0,AppProtocol:nil,},ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},ServicePort{Name:metrics,Protocol:TCP,Port:1936,TargetPort:{1 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\
"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.526314 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.542754 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.559406 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.574184 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: E0310 15:08:47.582383 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.594386 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.635351 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77f99bfb-549e-4e7e-9d7a-baf44c301a0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e0d5ea865b0805fae58dc7df068aadebac4d089968699012a9979b4144410aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:06:37.969814 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:06:37.971387 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:06:37.972633 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:06:37.976050 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 15:07:06.308828 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 15:07:07.495570 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 15:07:07.495755 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34142cc84eeec1486ccc227175b073882d898b9b6376d5fbb9c019fbe8ef1be3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721fcc3f29c16b316e4076833a39ab9b8ae4021210ed96b59b940498b9940bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.665258 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:35Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af\\\\n2026-03-10T15:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af to /host/opt/cni/bin/\\\\n2026-03-10T15:07:50Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:50Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T15:08:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.685579 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.701733 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 
15:08:47.721886 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521
fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.735616 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.754183 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d19
6a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.776897 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77f99bfb-549e-4e7e-9d7a-baf44c301a0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e0d5ea865b0805fae58dc7df068aadebac4d089968699012a9979b4144410aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:06:37.969814 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:06:37.971387 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:06:37.972633 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:06:37.976050 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 15:07:06.308828 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 15:07:07.495570 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 15:07:07.495755 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34142cc84eeec1486ccc227175b073882d898b9b6376d5fbb9c019fbe8ef1be3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721fcc3f29c16b316e4076833a39ab9b8ae4021210ed96b59b940498b9940bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.792702 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:35Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af\\\\n2026-03-10T15:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af to /host/opt/cni/bin/\\\\n2026-03-10T15:07:50Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:50Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T15:08:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.806686 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.821126 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.841210 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.859418 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.878593 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.897019 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.919347 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.935846 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.949233 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.966581 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\"
,\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:47 crc kubenswrapper[4795]: I0310 15:08:47.985566 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:47Z is after 2025-08-24T17:21:41Z" 
Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.007863 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51d53b967123e49d686c069849bcca84cf0835469c8c47f0cd5904f9a044a849\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:16Z\\\",\\\"message\\\":\\\"led attempt(s)\\\\nI0310 15:08:16.538104 7073 default_network_controller.go:776] Recording success event on pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0310 15:08:16.538014 7073 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-operators\\\\\\\"}\\\\nI0310 15:08:16.537883 7073 ovn.go:134] Ensuring zone local for Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk in node crc\\\\nI0310 15:08:16.538132 7073 services_controller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.433726ms\\\\nI0310 15:08:16.538135 7073 obj_retry.go:386] Retry successful for *v1.Pod openshift-ovn-kubernetes/ovnkube-node-4q8gk after 0 failed attempt(s)\\\\nI0310 15:08:16.537918 7073 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-jc8ps\\\\nI0310 15:08:16.538147 7073 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-4q8gk\\\\nI0310 15:08:16.537898 7073 obj_retry.go:365] Adding new object: 
*v1.Pod openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk\\\\nI0310 15:08:16.537793 7073 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-additional-cni-plugins-tn44z\\\\nI0310 15:08:16.537922 7073 obj_retry.go:386]\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:46Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:08:46.555303 7397 services_controller.go:434] Service openshift-ingress/router-internal-default retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{router-internal-default openshift-ingress ed650b1e-939d-4166-88db-ddadc6d5accd 5426 0 2025-02-23 05:23:23 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingresscontroller.operator.openshift.io/owning-ingresscontroller:default] map[service.alpha.openshift.io/serving-cert-secret-name:router-metrics-certs-default service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{apps/v1 Deployment router-default bb5b8eff-af7d-47c8-85cd-465e0b29b7d1 0xc0077f4a9e \\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:http,Protocol:TCP,Port:80,TargetPort:{1 0 
http},NodePort:0,AppProtocol:nil,},ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},ServicePort{Name:metrics,Protocol:TCP,Port:1936,TargetPort:{1 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\
"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.028730 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.274387 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/3.log" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.277667 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.279321 4795 scope.go:117] "RemoveContainer" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08" Mar 10 15:08:48 crc kubenswrapper[4795]: E0310 15:08:48.279500 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.298520 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9f5aba50-acc2-446e-b3b4-e207c63e23aa\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8aebc5998361991639e6ddf8e50056ef5c463cfcb974312dbb4fd82f27def10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1669b97ddce8c5408cdec6f7a88d28e60f172083f19a3eaccbc7ca3bf7830d49\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.319850 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"77f99bfb-549e-4e7e-9d7a-baf44c301a0c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e0d5ea865b0805fae58dc7df068aadebac4d089968699012a9979b4144410aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e6d1861ba244e362fcb2792bdbc51398ad30b69476af94952419bcb4100fdf1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0310 15:06:37.969814 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0310 15:06:37.971387 1 observer_polling.go:159] Starting file observer\\\\nI0310 15:06:37.972633 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0310 15:06:37.976050 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0310 15:07:06.308828 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0310 15:07:07.495570 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0310 15:07:07.495755 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:37Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34142cc84eeec1486ccc227175b073882d898b9b6376d5fbb9c019fbe8ef1be3\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://721fcc3f29c16b316e4076833a39ab9b8ae4021210ed96b59b940498b9940bac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.337439 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-v49r8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"589b366f-9132-43cc-8d7a-d401d396bf06\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:35Z\\\",\\\"message\\\":\\\"2026-03-10T15:07:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af\\\\n2026-03-10T15:07:50+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f77ab94f-9f3d-4385-bed5-1bfb83bce6af to /host/opt/cni/bin/\\\\n2026-03-10T15:07:50Z [verbose] multus-daemon started\\\\n2026-03-10T15:07:50Z [verbose] 
Readiness Indicator file check\\\\n2026-03-10T15:08:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nbnsf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-v49r8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.350832 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3036349b-f184-48aa-b5ab-de9c5c7ae511\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf46p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-zjg2f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.365036 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e5d4b34d1e67bf5ed4606cbece483789096f6d7ea6515239dd6edcaa9f133c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.379952 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.390422 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-hmw5g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"729b6d95-56e9-4944-9397-28161f39fda6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f6044193206143bfcd37c59d1c916aa375c93033dca7e1ffb89981bc53692222\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-46vtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-hmw5g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.401883 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-jc8ps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"859bda50-59de-4471-a2d1-785d7b1d06d7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://035af564660401a521923a8d993b821ba35b2f9128dfdea828cc22fa6d9514d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k6vf9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:54Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-jc8ps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.413607 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b1ac9a-c515-42e3-ae9d-b05a058df9bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-10T15:07:07Z\\\"
,\\\"message\\\":\\\"W0310 15:07:06.747249 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0310 15:07:06.747916 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773155226 cert, and key in /tmp/serving-cert-798186730/serving-signer.crt, /tmp/serving-cert-798186730/serving-signer.key\\\\nI0310 15:07:07.416237 1 observer_polling.go:159] Starting file observer\\\\nW0310 15:07:07.430268 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0310 15:07:07.430488 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0310 15:07:07.431502 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-798186730/tls.crt::/tmp/serving-cert-798186730/tls.key\\\\\\\"\\\\nF0310 15:07:07.871568 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:06Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.424964 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"719f0c5f-022f-4d0e-9159-3a91404f1ee0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://630ef8ba07102f68e940ad97030a714f6eeceb6a314b2493fa321c3bd53a9a7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34
720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b19e9eda627403e2613751fd59483ecea6aa1cf763fff6165a34c30d2d0a9aae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8d49961a60d7bc3391437919466fd1ee8a00096d5e6acaecb6cf21658c1476bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"20
26-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22ac84d375b2d1696e9e083d5583e26934830c1546df9cb5f7e12f11453578f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.436953 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.450156 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.464806 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://889c332bd93660461691a20285472a90e5bfe18ab9acc59a4eac123b84430cdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.480401 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:28Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:41Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ad19e80982d9578f18bc4b985c594f266bb583ed66bc7f6bdedb1b420c0134b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://615445d811ef0682848d10414d33a598807be6c1105d1d1b0a76275e67dbcd53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.497347 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"message\\\":\\\"++ K8S_NODE=\\\\n++ [[ -n '' ]]\\\\n++ northd_pidfile=/var/run/ovn/ovn-northd.pid\\\\n++ controller_pidfile=/var/run/ovn/ovn-controller.pid\\\\n++ controller_logfile=/var/log/ovn/acl-audit-log.log\\\\n++ vswitch_dbsock=/var/run/openvswitch/db.sock\\\\n++ nbdb_pidfile=/var/run/ovn/ovnnb_db.pid\\\\n++ nbdb_sock=/var/run/ovn/ovnnb_db.sock\\\\n++ nbdb_ctl=/var/run/ovn/ovnnb_db.ctl\\\\n++ sbdb_pidfile=/var/run/ovn/ovnsb_db.pid\\\\n++ sbdb_sock=/var/run/ovn/ovnsb_db.sock\\\\n++ sbdb_ctl=/var/run/ovn/ovnsb_db.ctl\\\\n+ start-audit-log-rotation\\\\n+ MAXFILESIZE=50000000\\\\n+ MAXLOGFILES=5\\\\n++ dirname /var/log/ovn/acl-audit-log.log\\\\n+ LOGDIR=/var/log/ovn\\\\n+ local retries=0\\\\n+ [[ 30 -gt 0 ]]\\\\n+ (( retries += 1 ))\\\\n++ cat /var/run/ovn/ovn-controller.pid\\\\ncat: /var/run/ovn/ovn-controller.pid: No such file or 
directory\\\\n+ CONTROLLERPID=\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":
\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-10T15:08:46Z\\\",\\\"message\\\":\\\"eighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.169:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {39432221-5995-412b-967b-35e1a9405ec7}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0310 15:08:46.555303 7397 services_controller.go:434] Service openshift-ingress/router-internal-default retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{router-internal-default openshift-ingress ed650b1e-939d-4166-88db-ddadc6d5accd 5426 0 2025-02-23 05:23:23 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[ingresscontroller.operator.openshift.io/owning-ingresscontroller:default] map[service.alpha.openshift.io/serving-cert-secret-name:router-metrics-certs-default service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [{apps/v1 Deployment router-default bb5b8eff-af7d-47c8-85cd-465e0b29b7d1 0xc0077f4a9e 
\\\\u003cnil\\\\u003e}] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:http,Protocol:TCP,Port:80,TargetPort:{1 0 http},NodePort:0,AppProtocol:nil,},ServicePort{Name:https,Protocol:TCP,Port:443,TargetPort:{1 0 https},NodePort:0,AppProtocol:nil,},ServicePort{Name:metrics,Protocol:TCP,Port:1936,TargetPort:{1 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-10T15:08:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\
\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r92zs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4q8gk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.515370 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8f1b0207-5433-47fa-b5d6-85fe52cb120e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:06:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://98e24a6384d9270e32c8b8e4d61113dc945611a7178b443bd685a27f21340816\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0dcacb079c8b7b0fda875e9a6c24d00ab11f65019363b870c9d6534eaf17f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://930006158be5746c370f39b14e37c9743cb4189c405890b3f443baa4052c3544\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4929e2355ff033b19c2f912ca7cbdb5c3baeb76556860f6a312964c2d0209b50\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://db5430a2503fa0b72ab7ad540eab947f1ce8297aead9afefa3d159519f7d9731\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:06:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8fffd4003fad56d04ad2231da2103579022a0a52d859764d90e2734c6b233812\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-10T15:06:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a79b80c119fdd981a0dd27ee21900932290e0490c50c0b2dd29d2a98ad98802c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67e19c92c333d196a461bbde1ad220679064aece1740b7719e95a2ad053dfe67\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:06:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:06:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:06:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.531588 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"92ceb516-b88c-44bd-b534-25ea21b31379\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d06a06eac1e9ddaa27c5d9af7350e432e3c9e0c3a9e138cefc4336991770b8e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9f6sn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-747vh\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.560026 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tn44z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ec2edb56-6323-4261-8954-3e75a645ed42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:07:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93135ab35e1360b218a37e4195c1f5b752c4baf6929df153330887826b28e344\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:07:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aeb8aac0e9e8e9a85a90eac2c0177c084e353300eea476111d403a2344571f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://661fff8cf7e80c7a6ce28d6a113e26c47d84a6ce01f2c3a4dc411521fa728c43\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2f724af7c4431391f3e4577be4f3e70d2ade41fecd49cd7344d1bbffdf7362e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"m
ountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c4ce2958ae256646803cbf29761692e8e1503cc7321717ad353243e0416e78\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a24
73a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dbea7cf8cd4073b7a3299e78da631b7e9686ceb7343ecbb926a66a4a5e34b2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce56df04aa482dffdf75fe1ec7153f9a36bcb27f104640139a61264ca0222954\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-10T15:07:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-10T15:07:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/c
ni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr57s\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:07:48Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tn44z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:48 crc kubenswrapper[4795]: I0310 15:08:48.577508 4795 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ce2b7f29-4124-474a-9f51-18aa60a6fdfb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-10T15:08:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f0d9cdd7ab740cd19639797d1f22b9932588aacc96150d3b2d43cf730a7d3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fcd5454b4263d1225079ed6f503759129a08d
972be337e339a17798d48d01c34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-10T15:08:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-82dpl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-10T15:08:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-6kmmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-10T15:08:48Z is after 2025-08-24T17:21:41Z" Mar 10 15:08:49 crc kubenswrapper[4795]: I0310 15:08:49.475876 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:49 crc kubenswrapper[4795]: I0310 15:08:49.476044 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:49 crc kubenswrapper[4795]: I0310 15:08:49.476042 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:49 crc kubenswrapper[4795]: E0310 15:08:49.476288 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:49 crc kubenswrapper[4795]: I0310 15:08:49.476724 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:49 crc kubenswrapper[4795]: E0310 15:08:49.476920 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:49 crc kubenswrapper[4795]: E0310 15:08:49.477064 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:49 crc kubenswrapper[4795]: E0310 15:08:49.477295 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:51 crc kubenswrapper[4795]: I0310 15:08:51.476276 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:51 crc kubenswrapper[4795]: I0310 15:08:51.476322 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:51 crc kubenswrapper[4795]: E0310 15:08:51.476477 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:51 crc kubenswrapper[4795]: I0310 15:08:51.476542 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:51 crc kubenswrapper[4795]: E0310 15:08:51.476752 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:51 crc kubenswrapper[4795]: E0310 15:08:51.477020 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:51 crc kubenswrapper[4795]: I0310 15:08:51.477781 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:51 crc kubenswrapper[4795]: E0310 15:08:51.478303 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:52 crc kubenswrapper[4795]: E0310 15:08:52.583604 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:08:53 crc kubenswrapper[4795]: I0310 15:08:53.476206 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:53 crc kubenswrapper[4795]: I0310 15:08:53.476221 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:53 crc kubenswrapper[4795]: I0310 15:08:53.476264 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:53 crc kubenswrapper[4795]: I0310 15:08:53.476286 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:53 crc kubenswrapper[4795]: E0310 15:08:53.477176 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:53 crc kubenswrapper[4795]: E0310 15:08:53.477230 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:53 crc kubenswrapper[4795]: E0310 15:08:53.477307 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:53 crc kubenswrapper[4795]: E0310 15:08:53.477469 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:55 crc kubenswrapper[4795]: I0310 15:08:55.475797 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:55 crc kubenswrapper[4795]: E0310 15:08:55.475999 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:55 crc kubenswrapper[4795]: I0310 15:08:55.476028 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:55 crc kubenswrapper[4795]: I0310 15:08:55.476100 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:55 crc kubenswrapper[4795]: I0310 15:08:55.476129 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:55 crc kubenswrapper[4795]: E0310 15:08:55.476222 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:55 crc kubenswrapper[4795]: E0310 15:08:55.476346 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:55 crc kubenswrapper[4795]: E0310 15:08:55.476490 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.476370 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.476455 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:57 crc kubenswrapper[4795]: E0310 15:08:57.476579 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.477259 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.477304 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:57 crc kubenswrapper[4795]: E0310 15:08:57.477449 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:57 crc kubenswrapper[4795]: E0310 15:08:57.477588 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:57 crc kubenswrapper[4795]: E0310 15:08:57.477722 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.532818 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=71.532788774 podStartE2EDuration="1m11.532788774s" podCreationTimestamp="2026-03-10 15:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:57.529768738 +0000 UTC m=+170.695509676" watchObservedRunningTime="2026-03-10 15:08:57.532788774 +0000 UTC m=+170.698529712" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.539229 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.539293 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.539315 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.539343 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.539367 4795 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-10T15:08:57Z","lastTransitionTime":"2026-03-10T15:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.554345 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podStartSLOduration=102.554321748 podStartE2EDuration="1m42.554321748s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:57.552726652 +0000 UTC m=+170.718467590" watchObservedRunningTime="2026-03-10 15:08:57.554321748 +0000 UTC m=+170.720062686" Mar 10 15:08:57 crc kubenswrapper[4795]: E0310 15:08:57.584224 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.587165 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tn44z" podStartSLOduration=102.587144903 podStartE2EDuration="1m42.587144903s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:57.586845794 +0000 UTC m=+170.752586732" watchObservedRunningTime="2026-03-10 15:08:57.587144903 +0000 UTC m=+170.752885841" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.618400 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-6kmmk" podStartSLOduration=102.618369623 podStartE2EDuration="1m42.618369623s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:57.617182799 +0000 UTC m=+170.782923767" watchObservedRunningTime="2026-03-10 15:08:57.618369623 +0000 UTC m=+170.784110561" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.618990 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2"] Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.619561 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.624181 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.625168 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.625967 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.628614 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.649348 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=82.649331715 podStartE2EDuration="1m22.649331715s" podCreationTimestamp="2026-03-10 15:07:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:57.64917566 +0000 UTC m=+170.814916578" watchObservedRunningTime="2026-03-10 15:08:57.649331715 +0000 UTC m=+170.815072623" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.691345 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=34.691315961 podStartE2EDuration="34.691315961s" podCreationTimestamp="2026-03-10 15:08:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:57.6712734 +0000 UTC m=+170.837014338" watchObservedRunningTime="2026-03-10 
15:08:57.691315961 +0000 UTC m=+170.857056889" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.699361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8eb07199-a711-413c-80b0-a1ef30a8ceec-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.699418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eb07199-a711-413c-80b0-a1ef30a8ceec-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.699441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8eb07199-a711-413c-80b0-a1ef30a8ceec-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.699468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8eb07199-a711-413c-80b0-a1ef30a8ceec-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.699496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb07199-a711-413c-80b0-a1ef30a8ceec-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.709757 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-v49r8" podStartSLOduration=102.709737146 podStartE2EDuration="1m42.709737146s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:57.69162411 +0000 UTC m=+170.857365018" watchObservedRunningTime="2026-03-10 15:08:57.709737146 +0000 UTC m=+170.875478074" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.734052 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hmw5g" podStartSLOduration=102.734024018 podStartE2EDuration="1m42.734024018s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:57.721497721 +0000 UTC m=+170.887238659" watchObservedRunningTime="2026-03-10 15:08:57.734024018 +0000 UTC m=+170.899764966" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.734942 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-jc8ps" podStartSLOduration=102.734929404 podStartE2EDuration="1m42.734929404s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:57.733168163 +0000 UTC m=+170.898909141" watchObservedRunningTime="2026-03-10 15:08:57.734929404 +0000 UTC 
m=+170.900670332" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.751593 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.751565908 podStartE2EDuration="1m18.751565908s" podCreationTimestamp="2026-03-10 15:07:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:57.751496356 +0000 UTC m=+170.917237264" watchObservedRunningTime="2026-03-10 15:08:57.751565908 +0000 UTC m=+170.917306846" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.766930 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=42.766916325 podStartE2EDuration="42.766916325s" podCreationTimestamp="2026-03-10 15:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:57.766118522 +0000 UTC m=+170.931859420" watchObservedRunningTime="2026-03-10 15:08:57.766916325 +0000 UTC m=+170.932657233" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.800293 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb07199-a711-413c-80b0-a1ef30a8ceec-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.800370 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8eb07199-a711-413c-80b0-a1ef30a8ceec-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.800394 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eb07199-a711-413c-80b0-a1ef30a8ceec-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.800411 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8eb07199-a711-413c-80b0-a1ef30a8ceec-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.800430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8eb07199-a711-413c-80b0-a1ef30a8ceec-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.800482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8eb07199-a711-413c-80b0-a1ef30a8ceec-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.800616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/8eb07199-a711-413c-80b0-a1ef30a8ceec-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.801390 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8eb07199-a711-413c-80b0-a1ef30a8ceec-service-ca\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.808987 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8eb07199-a711-413c-80b0-a1ef30a8ceec-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.818094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8eb07199-a711-413c-80b0-a1ef30a8ceec-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-c7ls2\" (UID: \"8eb07199-a711-413c-80b0-a1ef30a8ceec\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: I0310 15:08:57.943479 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" Mar 10 15:08:57 crc kubenswrapper[4795]: W0310 15:08:57.959602 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8eb07199_a711_413c_80b0_a1ef30a8ceec.slice/crio-5260381fc6450c27d899572203e8b2c521ddc1846b00b1f21c0c3bf418f70964 WatchSource:0}: Error finding container 5260381fc6450c27d899572203e8b2c521ddc1846b00b1f21c0c3bf418f70964: Status 404 returned error can't find the container with id 5260381fc6450c27d899572203e8b2c521ddc1846b00b1f21c0c3bf418f70964 Mar 10 15:08:58 crc kubenswrapper[4795]: I0310 15:08:58.325110 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" event={"ID":"8eb07199-a711-413c-80b0-a1ef30a8ceec","Type":"ContainerStarted","Data":"f350faaceb68a8354c3e3e207237f5375c4c4a1630e1a64be1c01207ce7aefe7"} Mar 10 15:08:58 crc kubenswrapper[4795]: I0310 15:08:58.325538 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" event={"ID":"8eb07199-a711-413c-80b0-a1ef30a8ceec","Type":"ContainerStarted","Data":"5260381fc6450c27d899572203e8b2c521ddc1846b00b1f21c0c3bf418f70964"} Mar 10 15:08:58 crc kubenswrapper[4795]: I0310 15:08:58.349138 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-c7ls2" podStartSLOduration=103.349101323 podStartE2EDuration="1m43.349101323s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:08:58.34828442 +0000 UTC m=+171.514025378" watchObservedRunningTime="2026-03-10 15:08:58.349101323 +0000 UTC m=+171.514842271" Mar 10 15:08:58 crc kubenswrapper[4795]: I0310 15:08:58.523514 4795 
certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 10 15:08:58 crc kubenswrapper[4795]: I0310 15:08:58.537062 4795 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 10 15:08:59 crc kubenswrapper[4795]: I0310 15:08:59.476052 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:08:59 crc kubenswrapper[4795]: I0310 15:08:59.476141 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:08:59 crc kubenswrapper[4795]: I0310 15:08:59.476251 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:08:59 crc kubenswrapper[4795]: E0310 15:08:59.476312 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:08:59 crc kubenswrapper[4795]: I0310 15:08:59.476434 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:08:59 crc kubenswrapper[4795]: E0310 15:08:59.476490 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:08:59 crc kubenswrapper[4795]: E0310 15:08:59.476633 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:08:59 crc kubenswrapper[4795]: E0310 15:08:59.476702 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:01 crc kubenswrapper[4795]: I0310 15:09:01.476463 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:01 crc kubenswrapper[4795]: I0310 15:09:01.476534 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:01 crc kubenswrapper[4795]: I0310 15:09:01.476465 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:01 crc kubenswrapper[4795]: I0310 15:09:01.476613 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:01 crc kubenswrapper[4795]: E0310 15:09:01.476653 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:01 crc kubenswrapper[4795]: E0310 15:09:01.476766 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:01 crc kubenswrapper[4795]: E0310 15:09:01.476864 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:01 crc kubenswrapper[4795]: E0310 15:09:01.476984 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:02 crc kubenswrapper[4795]: I0310 15:09:02.476500 4795 scope.go:117] "RemoveContainer" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08" Mar 10 15:09:02 crc kubenswrapper[4795]: E0310 15:09:02.476650 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" Mar 10 15:09:02 crc kubenswrapper[4795]: E0310 15:09:02.585603 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:09:03 crc kubenswrapper[4795]: I0310 15:09:03.475600 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:03 crc kubenswrapper[4795]: I0310 15:09:03.475724 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:03 crc kubenswrapper[4795]: I0310 15:09:03.475630 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:03 crc kubenswrapper[4795]: E0310 15:09:03.475773 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:03 crc kubenswrapper[4795]: I0310 15:09:03.475826 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:03 crc kubenswrapper[4795]: E0310 15:09:03.475943 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:03 crc kubenswrapper[4795]: E0310 15:09:03.476058 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:03 crc kubenswrapper[4795]: E0310 15:09:03.476203 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:05 crc kubenswrapper[4795]: I0310 15:09:05.285385 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:05 crc kubenswrapper[4795]: E0310 15:09:05.285617 4795 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:09:05 crc kubenswrapper[4795]: E0310 15:09:05.285705 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs podName:3036349b-f184-48aa-b5ab-de9c5c7ae511 nodeName:}" failed. No retries permitted until 2026-03-10 15:10:09.285683616 +0000 UTC m=+242.451424524 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs") pod "network-metrics-daemon-zjg2f" (UID: "3036349b-f184-48aa-b5ab-de9c5c7ae511") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 10 15:09:05 crc kubenswrapper[4795]: I0310 15:09:05.475854 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:05 crc kubenswrapper[4795]: I0310 15:09:05.475913 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:05 crc kubenswrapper[4795]: I0310 15:09:05.475870 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:05 crc kubenswrapper[4795]: I0310 15:09:05.475892 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:05 crc kubenswrapper[4795]: E0310 15:09:05.476056 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:05 crc kubenswrapper[4795]: E0310 15:09:05.476307 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:05 crc kubenswrapper[4795]: E0310 15:09:05.476409 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:05 crc kubenswrapper[4795]: E0310 15:09:05.476495 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:07 crc kubenswrapper[4795]: I0310 15:09:07.475624 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:07 crc kubenswrapper[4795]: I0310 15:09:07.475653 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:07 crc kubenswrapper[4795]: I0310 15:09:07.475698 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:07 crc kubenswrapper[4795]: I0310 15:09:07.475786 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:07 crc kubenswrapper[4795]: E0310 15:09:07.477919 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:07 crc kubenswrapper[4795]: E0310 15:09:07.478134 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:07 crc kubenswrapper[4795]: E0310 15:09:07.478285 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:07 crc kubenswrapper[4795]: E0310 15:09:07.478432 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:07 crc kubenswrapper[4795]: E0310 15:09:07.586377 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:09:09 crc kubenswrapper[4795]: I0310 15:09:09.475853 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:09 crc kubenswrapper[4795]: I0310 15:09:09.475896 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:09 crc kubenswrapper[4795]: E0310 15:09:09.476049 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:09 crc kubenswrapper[4795]: I0310 15:09:09.476129 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:09 crc kubenswrapper[4795]: E0310 15:09:09.476223 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:09 crc kubenswrapper[4795]: E0310 15:09:09.476329 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:09 crc kubenswrapper[4795]: I0310 15:09:09.476888 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:09 crc kubenswrapper[4795]: E0310 15:09:09.477121 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:11 crc kubenswrapper[4795]: I0310 15:09:11.476289 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:11 crc kubenswrapper[4795]: I0310 15:09:11.476391 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:11 crc kubenswrapper[4795]: I0310 15:09:11.476494 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:11 crc kubenswrapper[4795]: E0310 15:09:11.476695 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:11 crc kubenswrapper[4795]: I0310 15:09:11.476769 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:11 crc kubenswrapper[4795]: E0310 15:09:11.477005 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:11 crc kubenswrapper[4795]: E0310 15:09:11.477170 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:11 crc kubenswrapper[4795]: E0310 15:09:11.477289 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:12 crc kubenswrapper[4795]: E0310 15:09:12.587783 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:09:13 crc kubenswrapper[4795]: I0310 15:09:13.475970 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:13 crc kubenswrapper[4795]: E0310 15:09:13.476184 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:13 crc kubenswrapper[4795]: I0310 15:09:13.476361 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:13 crc kubenswrapper[4795]: I0310 15:09:13.476437 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:13 crc kubenswrapper[4795]: I0310 15:09:13.476361 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:13 crc kubenswrapper[4795]: E0310 15:09:13.476573 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:13 crc kubenswrapper[4795]: E0310 15:09:13.476673 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:13 crc kubenswrapper[4795]: E0310 15:09:13.476772 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:14 crc kubenswrapper[4795]: I0310 15:09:14.477427 4795 scope.go:117] "RemoveContainer" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08" Mar 10 15:09:14 crc kubenswrapper[4795]: E0310 15:09:14.477710 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4q8gk_openshift-ovn-kubernetes(89b0616b-9d8b-43a5-b8c7-d9cbb4669583)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" Mar 10 15:09:15 crc kubenswrapper[4795]: I0310 15:09:15.476514 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:15 crc kubenswrapper[4795]: I0310 15:09:15.476530 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:15 crc kubenswrapper[4795]: I0310 15:09:15.476538 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:15 crc kubenswrapper[4795]: I0310 15:09:15.476571 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:15 crc kubenswrapper[4795]: E0310 15:09:15.477001 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:15 crc kubenswrapper[4795]: E0310 15:09:15.477346 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:15 crc kubenswrapper[4795]: E0310 15:09:15.477433 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:15 crc kubenswrapper[4795]: E0310 15:09:15.477603 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:17 crc kubenswrapper[4795]: I0310 15:09:17.476207 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:17 crc kubenswrapper[4795]: I0310 15:09:17.478939 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:17 crc kubenswrapper[4795]: I0310 15:09:17.479051 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:17 crc kubenswrapper[4795]: I0310 15:09:17.479121 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:17 crc kubenswrapper[4795]: E0310 15:09:17.479261 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:17 crc kubenswrapper[4795]: E0310 15:09:17.479814 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:17 crc kubenswrapper[4795]: E0310 15:09:17.479599 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:17 crc kubenswrapper[4795]: E0310 15:09:17.480222 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:17 crc kubenswrapper[4795]: E0310 15:09:17.588508 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:09:19 crc kubenswrapper[4795]: I0310 15:09:19.475948 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:19 crc kubenswrapper[4795]: I0310 15:09:19.476029 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:19 crc kubenswrapper[4795]: E0310 15:09:19.476215 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:19 crc kubenswrapper[4795]: E0310 15:09:19.476566 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:19 crc kubenswrapper[4795]: I0310 15:09:19.477511 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:19 crc kubenswrapper[4795]: E0310 15:09:19.477823 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:19 crc kubenswrapper[4795]: I0310 15:09:19.478040 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:19 crc kubenswrapper[4795]: E0310 15:09:19.478321 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:21 crc kubenswrapper[4795]: I0310 15:09:21.476051 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:21 crc kubenswrapper[4795]: I0310 15:09:21.476059 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:21 crc kubenswrapper[4795]: E0310 15:09:21.476649 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:21 crc kubenswrapper[4795]: I0310 15:09:21.476352 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:21 crc kubenswrapper[4795]: E0310 15:09:21.476806 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:21 crc kubenswrapper[4795]: I0310 15:09:21.476337 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:21 crc kubenswrapper[4795]: E0310 15:09:21.476969 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:21 crc kubenswrapper[4795]: E0310 15:09:21.477181 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:22 crc kubenswrapper[4795]: I0310 15:09:22.413975 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v49r8_589b366f-9132-43cc-8d7a-d401d396bf06/kube-multus/1.log" Mar 10 15:09:22 crc kubenswrapper[4795]: I0310 15:09:22.415274 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v49r8_589b366f-9132-43cc-8d7a-d401d396bf06/kube-multus/0.log" Mar 10 15:09:22 crc kubenswrapper[4795]: I0310 15:09:22.415358 4795 generic.go:334] "Generic (PLEG): container finished" podID="589b366f-9132-43cc-8d7a-d401d396bf06" containerID="e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6" exitCode=1 Mar 10 15:09:22 crc kubenswrapper[4795]: I0310 15:09:22.415403 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v49r8" 
event={"ID":"589b366f-9132-43cc-8d7a-d401d396bf06","Type":"ContainerDied","Data":"e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6"} Mar 10 15:09:22 crc kubenswrapper[4795]: I0310 15:09:22.415456 4795 scope.go:117] "RemoveContainer" containerID="35cdafdd59083dd87ea2efd92d209e729beee766de1117e5e610d6ec19e04db6" Mar 10 15:09:22 crc kubenswrapper[4795]: I0310 15:09:22.416056 4795 scope.go:117] "RemoveContainer" containerID="e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6" Mar 10 15:09:22 crc kubenswrapper[4795]: E0310 15:09:22.416342 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-v49r8_openshift-multus(589b366f-9132-43cc-8d7a-d401d396bf06)\"" pod="openshift-multus/multus-v49r8" podUID="589b366f-9132-43cc-8d7a-d401d396bf06" Mar 10 15:09:22 crc kubenswrapper[4795]: E0310 15:09:22.589831 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:09:23 crc kubenswrapper[4795]: I0310 15:09:23.421629 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v49r8_589b366f-9132-43cc-8d7a-d401d396bf06/kube-multus/1.log" Mar 10 15:09:23 crc kubenswrapper[4795]: I0310 15:09:23.475911 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:23 crc kubenswrapper[4795]: I0310 15:09:23.475957 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:23 crc kubenswrapper[4795]: E0310 15:09:23.476174 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:23 crc kubenswrapper[4795]: I0310 15:09:23.476238 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:23 crc kubenswrapper[4795]: E0310 15:09:23.476377 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:23 crc kubenswrapper[4795]: E0310 15:09:23.476504 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:23 crc kubenswrapper[4795]: I0310 15:09:23.476616 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:23 crc kubenswrapper[4795]: E0310 15:09:23.476722 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:25 crc kubenswrapper[4795]: I0310 15:09:25.475582 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:25 crc kubenswrapper[4795]: I0310 15:09:25.475676 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:25 crc kubenswrapper[4795]: I0310 15:09:25.475755 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:25 crc kubenswrapper[4795]: E0310 15:09:25.475790 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:25 crc kubenswrapper[4795]: I0310 15:09:25.475834 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:25 crc kubenswrapper[4795]: E0310 15:09:25.476018 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:25 crc kubenswrapper[4795]: E0310 15:09:25.476329 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:25 crc kubenswrapper[4795]: E0310 15:09:25.476439 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:27 crc kubenswrapper[4795]: I0310 15:09:27.476534 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:27 crc kubenswrapper[4795]: I0310 15:09:27.476667 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:27 crc kubenswrapper[4795]: I0310 15:09:27.476681 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:27 crc kubenswrapper[4795]: I0310 15:09:27.479043 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:27 crc kubenswrapper[4795]: E0310 15:09:27.479031 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:27 crc kubenswrapper[4795]: E0310 15:09:27.479172 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:27 crc kubenswrapper[4795]: E0310 15:09:27.479315 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:27 crc kubenswrapper[4795]: E0310 15:09:27.479454 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:27 crc kubenswrapper[4795]: E0310 15:09:27.590387 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:09:28 crc kubenswrapper[4795]: I0310 15:09:28.477020 4795 scope.go:117] "RemoveContainer" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08" Mar 10 15:09:29 crc kubenswrapper[4795]: I0310 15:09:29.325663 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zjg2f"] Mar 10 15:09:29 crc kubenswrapper[4795]: I0310 15:09:29.326115 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:29 crc kubenswrapper[4795]: E0310 15:09:29.326249 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:29 crc kubenswrapper[4795]: I0310 15:09:29.444127 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/3.log" Mar 10 15:09:29 crc kubenswrapper[4795]: I0310 15:09:29.446743 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:09:29 crc kubenswrapper[4795]: I0310 15:09:29.447618 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerStarted","Data":"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b"} Mar 10 15:09:29 crc kubenswrapper[4795]: I0310 15:09:29.448012 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:09:29 crc kubenswrapper[4795]: I0310 15:09:29.476307 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:29 crc kubenswrapper[4795]: I0310 15:09:29.476307 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:29 crc kubenswrapper[4795]: I0310 15:09:29.476350 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:29 crc kubenswrapper[4795]: E0310 15:09:29.476564 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:29 crc kubenswrapper[4795]: E0310 15:09:29.476716 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:29 crc kubenswrapper[4795]: E0310 15:09:29.476786 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:29 crc kubenswrapper[4795]: I0310 15:09:29.485502 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podStartSLOduration=134.485474593 podStartE2EDuration="2m14.485474593s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:29.483490917 +0000 UTC m=+202.649231805" watchObservedRunningTime="2026-03-10 15:09:29.485474593 +0000 UTC m=+202.651215531" Mar 10 15:09:30 crc kubenswrapper[4795]: I0310 15:09:30.475440 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:30 crc kubenswrapper[4795]: E0310 15:09:30.475787 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:31 crc kubenswrapper[4795]: I0310 15:09:31.476528 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:31 crc kubenswrapper[4795]: I0310 15:09:31.476685 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:31 crc kubenswrapper[4795]: E0310 15:09:31.476802 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:31 crc kubenswrapper[4795]: E0310 15:09:31.476944 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:31 crc kubenswrapper[4795]: I0310 15:09:31.477165 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:31 crc kubenswrapper[4795]: E0310 15:09:31.477298 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:32 crc kubenswrapper[4795]: I0310 15:09:32.475853 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:32 crc kubenswrapper[4795]: E0310 15:09:32.476047 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:32 crc kubenswrapper[4795]: E0310 15:09:32.592275 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:09:33 crc kubenswrapper[4795]: I0310 15:09:33.475974 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:33 crc kubenswrapper[4795]: I0310 15:09:33.476006 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:33 crc kubenswrapper[4795]: E0310 15:09:33.476146 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:33 crc kubenswrapper[4795]: I0310 15:09:33.476269 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:33 crc kubenswrapper[4795]: E0310 15:09:33.476369 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:33 crc kubenswrapper[4795]: E0310 15:09:33.476476 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:34 crc kubenswrapper[4795]: I0310 15:09:34.476525 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:34 crc kubenswrapper[4795]: E0310 15:09:34.476697 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:35 crc kubenswrapper[4795]: I0310 15:09:35.475866 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:35 crc kubenswrapper[4795]: I0310 15:09:35.475944 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:35 crc kubenswrapper[4795]: I0310 15:09:35.475877 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.476133 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.476297 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.476392 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:35 crc kubenswrapper[4795]: I0310 15:09:35.478631 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:35 crc kubenswrapper[4795]: I0310 15:09:35.478689 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.478799 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.478817 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.478826 4795 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.478847 4795 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.478874 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-10 15:11:37.478857925 +0000 UTC m=+330.644598823 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.478916 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:11:37.478895836 +0000 UTC m=+330.644636774 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 10 15:09:35 crc kubenswrapper[4795]: I0310 15:09:35.579491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.579726 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:11:37.579695814 +0000 UTC m=+330.745436742 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:35 crc kubenswrapper[4795]: I0310 15:09:35.579821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:35 crc kubenswrapper[4795]: I0310 15:09:35.579864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.580022 4795 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.580058 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.580125 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-10 15:11:37.580106475 +0000 UTC m=+330.745847413 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.580134 4795 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.580195 4795 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:09:35 crc kubenswrapper[4795]: E0310 15:09:35.580288 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-10 15:11:37.580261369 +0000 UTC m=+330.746002307 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 10 15:09:36 crc kubenswrapper[4795]: I0310 15:09:36.475635 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:36 crc kubenswrapper[4795]: I0310 15:09:36.476236 4795 scope.go:117] "RemoveContainer" containerID="e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6" Mar 10 15:09:36 crc kubenswrapper[4795]: E0310 15:09:36.476343 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:37 crc kubenswrapper[4795]: I0310 15:09:37.716563 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:37 crc kubenswrapper[4795]: I0310 15:09:37.716993 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:37 crc kubenswrapper[4795]: I0310 15:09:37.716955 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:37 crc kubenswrapper[4795]: E0310 15:09:37.718111 4795 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 10 15:09:37 crc kubenswrapper[4795]: E0310 15:09:37.718402 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:37 crc kubenswrapper[4795]: E0310 15:09:37.718865 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:37 crc kubenswrapper[4795]: E0310 15:09:37.718924 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:37 crc kubenswrapper[4795]: I0310 15:09:37.721824 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v49r8_589b366f-9132-43cc-8d7a-d401d396bf06/kube-multus/1.log" Mar 10 15:09:37 crc kubenswrapper[4795]: I0310 15:09:37.726046 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v49r8" event={"ID":"589b366f-9132-43cc-8d7a-d401d396bf06","Type":"ContainerStarted","Data":"a32e09e6dbae777fd5c7993935247e382578d35bdb2f96fa0cb4593dbf63686a"} Mar 10 15:09:38 crc kubenswrapper[4795]: I0310 15:09:38.475562 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:38 crc kubenswrapper[4795]: E0310 15:09:38.475739 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:39 crc kubenswrapper[4795]: I0310 15:09:39.476116 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:39 crc kubenswrapper[4795]: E0310 15:09:39.476350 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:39 crc kubenswrapper[4795]: I0310 15:09:39.476456 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:39 crc kubenswrapper[4795]: I0310 15:09:39.476534 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:39 crc kubenswrapper[4795]: E0310 15:09:39.476672 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:39 crc kubenswrapper[4795]: E0310 15:09:39.476751 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:40 crc kubenswrapper[4795]: I0310 15:09:40.475801 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:40 crc kubenswrapper[4795]: E0310 15:09:40.475916 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:41 crc kubenswrapper[4795]: I0310 15:09:41.476555 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:41 crc kubenswrapper[4795]: I0310 15:09:41.476675 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:41 crc kubenswrapper[4795]: E0310 15:09:41.476737 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:09:41 crc kubenswrapper[4795]: E0310 15:09:41.476853 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 10 15:09:41 crc kubenswrapper[4795]: I0310 15:09:41.476900 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:41 crc kubenswrapper[4795]: E0310 15:09:41.477198 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 10 15:09:42 crc kubenswrapper[4795]: I0310 15:09:42.476099 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:42 crc kubenswrapper[4795]: E0310 15:09:42.476305 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zjg2f" podUID="3036349b-f184-48aa-b5ab-de9c5c7ae511" Mar 10 15:09:43 crc kubenswrapper[4795]: I0310 15:09:43.475690 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:09:43 crc kubenswrapper[4795]: I0310 15:09:43.475727 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:09:43 crc kubenswrapper[4795]: I0310 15:09:43.476178 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:09:43 crc kubenswrapper[4795]: I0310 15:09:43.478430 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 15:09:43 crc kubenswrapper[4795]: I0310 15:09:43.478490 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 15:09:43 crc kubenswrapper[4795]: I0310 15:09:43.478930 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 15:09:43 crc kubenswrapper[4795]: I0310 15:09:43.479703 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 15:09:44 crc kubenswrapper[4795]: I0310 15:09:44.476463 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:09:44 crc kubenswrapper[4795]: I0310 15:09:44.479644 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 15:09:44 crc kubenswrapper[4795]: I0310 15:09:44.479773 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.467005 4795 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.512472 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cg27q"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.513401 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.517888 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.517986 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.518854 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.519277 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.519285 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.520035 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.534245 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7csjs"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.534746 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.535258 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.535748 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.537240 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cp7r2"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.538128 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cp7r2"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.540128 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.540205 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.558119 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd6qf"]
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.558985 4795 reflector.go:561] object-"openshift-oauth-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.559104 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.574579 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.574885 4795 reflector.go:561] object-"openshift-apiserver"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.574935 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.575231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.575463 4795 reflector.go:561] object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff": failed to list *v1.Secret: secrets "openshift-apiserver-sa-dockercfg-djjff" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.575498 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-apiserver-sa-dockercfg-djjff\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-apiserver-sa-dockercfg-djjff\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.575573 4795 reflector.go:561] object-"openshift-apiserver"/"etcd-serving-ca": failed to list *v1.ConfigMap: configmaps "etcd-serving-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.575594 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-serving-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"etcd-serving-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.575703 4795 reflector.go:561] object-"openshift-oauth-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.575725 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.576210 4795 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.576246 4795 reflector.go:561] object-"openshift-apiserver"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.576288 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.576303 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.576365 4795 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.576379 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.576393 4795 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed to list *v1.ConfigMap: configmaps "machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.576422 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.576459 4795 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.576473 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.576511 4795 reflector.go:561] object-"openshift-oauth-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.576522 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.576573 4795 reflector.go:561] object-"openshift-machine-api"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.576584 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.576666 4795 reflector.go:561] object-"openshift-oauth-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.576690 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.576706 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.576903 4795 reflector.go:561] object-"openshift-apiserver"/"audit-1": failed to list *v1.ConfigMap: configmaps "audit-1" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.576918 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"audit-1\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"audit-1\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.576966 4795 reflector.go:561] object-"openshift-apiserver"/"encryption-config-1": failed to list *v1.Secret: secrets "encryption-config-1" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.576980 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"encryption-config-1\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"encryption-config-1\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.577189 4795 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.577204 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.577247 4795 reflector.go:561] object-"openshift-apiserver"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.577259 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.577312 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.577749 4795 reflector.go:561] object-"openshift-oauth-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.577781 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.577857 4795 reflector.go:561] object-"openshift-apiserver"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.577878 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.577901 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.578044 4795 reflector.go:561] object-"openshift-apiserver"/"etcd-client": failed to list *v1.Secret: secrets "etcd-client" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.578059 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"etcd-client\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"etcd-client\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.578090 4795 reflector.go:561] object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq": failed to list *v1.Secret: secrets "oauth-apiserver-sa-dockercfg-6r2bq" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-oauth-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.578100 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-oauth-apiserver\"/\"oauth-apiserver-sa-dockercfg-6r2bq\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"oauth-apiserver-sa-dockercfg-6r2bq\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-oauth-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: W0310 15:09:48.578140 4795 reflector.go:561] object-"openshift-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-apiserver": no relationship found between node 'crc' and this object
Mar 10 15:09:48 crc kubenswrapper[4795]: E0310 15:09:48.578150 4795 reflector.go:158] "Unhandled Error" err="object-\"openshift-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-apiserver\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.579527 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.584705 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.584984 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.585113 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.585160 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-w7lpw"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.585209 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.585320 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.585332 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.585376 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.585507 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.585612 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.585799 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.585925 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-bqvzg"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.586002 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.586049 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.586122 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.586290 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.586345 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.586753 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.586338 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bqvzg"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.587735 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.588161 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.588198 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-99x29"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.588615 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.588618 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-99x29"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.589339 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.589817 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.592593 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-65bbp"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.593138 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.593642 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.594095 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.594187 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6w8jb"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.594794 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.595249 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.595680 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.596998 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-g452c"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.597427 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g452c"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.598024 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.598453 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.609579 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.610109 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.610292 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.610618 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.610887 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.611162 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.611380 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.611547 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.611791 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.611834 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.611952 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612155 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612295 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612361 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612406 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612486 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612537 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612590 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612688 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612708 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612798 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612820 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612929 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.613003 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.609618 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.612298 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.613230 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.613625 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqw59"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.613895 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.614117 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.609726 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.609805 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.609846 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.616414 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.616630 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.621002 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q"]
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.621469 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.629636 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.630021 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.630289 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.630561 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.630908 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.632804 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.632808 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.633520 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.634008 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.634499 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.634830 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.634980 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.635580 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.635971 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.636356 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.640310 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.640709 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.653290 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.653467 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit-dir\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2"
Mar 10 15:09:48 crc kubenswrapper[4795]:
I0310 15:09:48.657420 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnpnt\" (UniqueName: \"kubernetes.io/projected/db723c0e-ac66-4ba7-a4f2-8ba208979d12-kube-api-access-hnpnt\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657479 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-audit-policies\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-policies\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657576 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c039cb4c-5391-4e99-828a-884abc3c3cf2-proxy-tls\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: 
\"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657601 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-client-ca\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657645 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-serving-cert\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657668 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-client\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657689 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ca0abdf-0ae2-46bd-9b82-de007e620a36-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/264d5645-f3d4-4ad1-b7ad-01ef534d4a20-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cpg78\" (UID: \"264d5645-f3d4-4ad1-b7ad-01ef534d4a20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657770 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657808 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-config\") pod \"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676a93ca-0674-4884-991b-89518b118411-serving-cert\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657842 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fl79\" (UniqueName: \"kubernetes.io/projected/e13b22c6-cfc8-4709-9610-040c0e80b2c4-kube-api-access-6fl79\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ca0abdf-0ae2-46bd-9b82-de007e620a36-service-ca-bundle\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v6c5\" (UniqueName: \"kubernetes.io/projected/3f0f577c-af8a-414f-ad38-1d1d839d472f-kube-api-access-6v6c5\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657906 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657925 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qltmm\" (UniqueName: \"kubernetes.io/projected/6ca0abdf-0ae2-46bd-9b82-de007e620a36-kube-api-access-qltmm\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657965 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.657983 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ca0abdf-0ae2-46bd-9b82-de007e620a36-serving-cert\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658008 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-serving-cert\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658040 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c039cb4c-5391-4e99-828a-884abc3c3cf2-images\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: \"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658053 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658122 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54355272-2661-4143-acf1-aa5d1c772e5d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vrbm6\" (UID: \"54355272-2661-4143-acf1-aa5d1c772e5d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658142 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-trusted-ca\") pod \"console-operator-58897d9998-99x29\" (UID: \"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658175 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658194 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-config\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658209 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-encryption-config\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658227 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x549s\" (UniqueName: \"kubernetes.io/projected/46936998-cf8b-4370-b4e6-e5a8759c895e-kube-api-access-x549s\") pod \"openshift-config-operator-7777fb866f-nhjkm\" (UID: \"46936998-cf8b-4370-b4e6-e5a8759c895e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd6dn\" (UniqueName: \"kubernetes.io/projected/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-kube-api-access-hd6dn\") pod \"console-operator-58897d9998-99x29\" (UID: \"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658279 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdfvt\" (UniqueName: \"kubernetes.io/projected/264d5645-f3d4-4ad1-b7ad-01ef534d4a20-kube-api-access-qdfvt\") pod 
\"cluster-samples-operator-665b6dd947-cpg78\" (UID: \"264d5645-f3d4-4ad1-b7ad-01ef534d4a20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658294 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658326 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658342 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46936998-cf8b-4370-b4e6-e5a8759c895e-serving-cert\") pod \"openshift-config-operator-7777fb866f-nhjkm\" (UID: \"46936998-cf8b-4370-b4e6-e5a8759c895e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658355 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e13b22c6-cfc8-4709-9610-040c0e80b2c4-audit-dir\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: 
I0310 15:09:48.658371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/db723c0e-ac66-4ba7-a4f2-8ba208979d12-etcd-service-ca\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658422 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-encryption-config\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658435 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frwpt\" (UniqueName: \"kubernetes.io/projected/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-kube-api-access-frwpt\") pod \"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658451 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-serving-cert\") pod \"console-operator-58897d9998-99x29\" (UID: \"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658483 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-auth-proxy-config\") pod \"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djn8x\" (UniqueName: \"kubernetes.io/projected/54355272-2661-4143-acf1-aa5d1c772e5d-kube-api-access-djn8x\") pod \"kube-storage-version-migrator-operator-b67b599dd-vrbm6\" (UID: \"54355272-2661-4143-acf1-aa5d1c772e5d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658522 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/46936998-cf8b-4370-b4e6-e5a8759c895e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nhjkm\" (UID: \"46936998-cf8b-4370-b4e6-e5a8759c895e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658538 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbn6l\" (UniqueName: \"kubernetes.io/projected/8c484e22-84bb-402d-89ec-5251b11ae7e3-kube-api-access-lbn6l\") pod \"downloads-7954f5f757-bqvzg\" (UID: 
\"8c484e22-84bb-402d-89ec-5251b11ae7e3\") " pod="openshift-console/downloads-7954f5f757-bqvzg" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658568 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658593 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f0f577c-af8a-414f-ad38-1d1d839d472f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658612 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpbt8\" (UniqueName: \"kubernetes.io/projected/05334446-f6c6-4982-9232-43c588eab91f-kube-api-access-hpbt8\") pod \"openshift-apiserver-operator-796bbdcf4f-9s254\" (UID: \"05334446-f6c6-4982-9232-43c588eab91f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658630 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.653489 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.654811 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.658641 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659168 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-image-import-ca\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts78r\" (UniqueName: \"kubernetes.io/projected/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-kube-api-access-ts78r\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05334446-f6c6-4982-9232-43c588eab91f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9s254\" (UID: \"05334446-f6c6-4982-9232-43c588eab91f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659265 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-config\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659283 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-config\") pod \"console-operator-58897d9998-99x29\" (UID: \"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db723c0e-ac66-4ba7-a4f2-8ba208979d12-config\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659330 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ca0abdf-0ae2-46bd-9b82-de007e620a36-config\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659348 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659367 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05334446-f6c6-4982-9232-43c588eab91f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9s254\" (UID: \"05334446-f6c6-4982-9232-43c588eab91f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659381 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-client\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56lgv\" (UniqueName: \"kubernetes.io/projected/03b6f1dc-43b6-489a-8990-7f4a9a33d535-kube-api-access-56lgv\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659431 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db723c0e-ac66-4ba7-a4f2-8ba208979d12-serving-cert\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659446 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/db723c0e-ac66-4ba7-a4f2-8ba208979d12-etcd-ca\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9zzx\" (UniqueName: \"kubernetes.io/projected/c039cb4c-5391-4e99-828a-884abc3c3cf2-kube-api-access-q9zzx\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: \"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gclg\" (UniqueName: \"kubernetes.io/projected/676a93ca-0674-4884-991b-89518b118411-kube-api-access-4gclg\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659496 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-dir\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659512 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54355272-2661-4143-acf1-aa5d1c772e5d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vrbm6\" (UID: \"54355272-2661-4143-acf1-aa5d1c772e5d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659530 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c039cb4c-5391-4e99-828a-884abc3c3cf2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: \"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659545 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.653605 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.659545 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-serving-ca\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.660666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2729207c-e9fe-4e76-9c70-81c9780f8d8c-metrics-tls\") pod \"dns-operator-744455d44c-6w8jb\" (UID: \"2729207c-e9fe-4e76-9c70-81c9780f8d8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.660688 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-config\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.660811 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 
10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.660850 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-images\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.655441 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.660903 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/03b6f1dc-43b6-489a-8990-7f4a9a33d535-node-pullsecrets\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.660935 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnn6r\" (UniqueName: \"kubernetes.io/projected/2729207c-e9fe-4e76-9c70-81c9780f8d8c-kube-api-access-hnn6r\") pod \"dns-operator-744455d44c-6w8jb\" (UID: \"2729207c-e9fe-4e76-9c70-81c9780f8d8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.660980 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-machine-approver-tls\") pod \"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.661009 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/db723c0e-ac66-4ba7-a4f2-8ba208979d12-etcd-client\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.664884 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.667491 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.668253 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.670272 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.672282 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gnrqn"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.672974 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.673998 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.674530 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.674877 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.675139 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.675431 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.675872 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.676265 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.682374 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.683258 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.683715 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.683837 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xmd69"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.684696 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.685610 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xg845"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.685975 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.686197 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.686540 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.686863 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.687052 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552588-zdmvt"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.691391 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.691665 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.691773 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.691853 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-zdmvt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.691931 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.692257 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.692796 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.692969 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.694057 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cg27q"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.694179 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.694515 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vn4gm"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.695260 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.695347 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.695601 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.696244 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.696287 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.696858 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.697283 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6q2qw"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.698435 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7csjs"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.698528 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6q2qw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.707539 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.709186 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.711586 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bqvzg"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.714375 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.715823 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.716544 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.717452 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.718374 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.719862 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.721695 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-w7lpw"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.727960 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.735726 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.742476 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gnrqn"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.746620 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-99x29"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.748480 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd6qf"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.752981 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.754150 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.755530 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6w8jb"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.756849 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.758261 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 15:09:48 crc 
kubenswrapper[4795]: I0310 15:09:48.759418 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.760500 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xmd69"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.761610 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqw59"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/264d5645-f3d4-4ad1-b7ad-01ef534d4a20-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cpg78\" (UID: \"264d5645-f3d4-4ad1-b7ad-01ef534d4a20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762521 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762638 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4m98\" (UniqueName: \"kubernetes.io/projected/a8e49156-de09-480a-933b-6815cde0b311-kube-api-access-b4m98\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762487 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762739 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2207f46f-83de-4190-a784-533331d951e7-serving-cert\") pod \"service-ca-operator-777779d784-zl4pz\" (UID: \"2207f46f-83de-4190-a784-533331d951e7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762792 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f45a1651-aa0f-44ad-9f76-4bcb22348e90-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dlcdg\" (UID: \"f45a1651-aa0f-44ad-9f76-4bcb22348e90\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762820 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fl79\" (UniqueName: \"kubernetes.io/projected/e13b22c6-cfc8-4709-9610-040c0e80b2c4-kube-api-access-6fl79\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762841 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v6c5\" (UniqueName: \"kubernetes.io/projected/3f0f577c-af8a-414f-ad38-1d1d839d472f-kube-api-access-6v6c5\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762870 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" 
(UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762888 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qltmm\" (UniqueName: \"kubernetes.io/projected/6ca0abdf-0ae2-46bd-9b82-de007e620a36-kube-api-access-qltmm\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762905 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzzkp\" (UniqueName: \"kubernetes.io/projected/4dfa6c2b-23af-4e87-9117-932c416ed4d2-kube-api-access-zzzkp\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762949 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c039cb4c-5391-4e99-828a-884abc3c3cf2-images\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: \"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762965 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/01f41b58-e54b-4f47-bd01-a11f9078087c-stats-auth\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.762984 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f35c76bb-0875-44e1-9d13-9583c9dc29df-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763001 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763023 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe203dd-0a5b-44c3-afd1-fe0452f276bc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sd26k\" (UID: \"afe203dd-0a5b-44c3-afd1-fe0452f276bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763052 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763090 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbbg7\" (UniqueName: \"kubernetes.io/projected/fe75b643-5c6f-461c-b1aa-4d73f89dad97-kube-api-access-lbbg7\") pod \"openshift-controller-manager-operator-756b6f6bc6-47qdt\" (UID: \"fe75b643-5c6f-461c-b1aa-4d73f89dad97\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763136 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-trusted-ca\") pod \"console-operator-58897d9998-99x29\" (UID: \"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-config\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763188 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x549s\" (UniqueName: \"kubernetes.io/projected/46936998-cf8b-4370-b4e6-e5a8759c895e-kube-api-access-x549s\") pod \"openshift-config-operator-7777fb866f-nhjkm\" (UID: \"46936998-cf8b-4370-b4e6-e5a8759c895e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763208 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hgbv\" (UniqueName: \"kubernetes.io/projected/2207f46f-83de-4190-a784-533331d951e7-kube-api-access-5hgbv\") pod \"service-ca-operator-777779d784-zl4pz\" (UID: \"2207f46f-83de-4190-a784-533331d951e7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qct7h\" (UniqueName: \"kubernetes.io/projected/f35c76bb-0875-44e1-9d13-9583c9dc29df-kube-api-access-qct7h\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763266 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e13b22c6-cfc8-4709-9610-040c0e80b2c4-audit-dir\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763284 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763306 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-serving-cert\") pod \"console-operator-58897d9998-99x29\" (UID: \"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763323 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-encryption-config\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763343 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frwpt\" (UniqueName: \"kubernetes.io/projected/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-kube-api-access-frwpt\") pod \"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbn6l\" (UniqueName: \"kubernetes.io/projected/8c484e22-84bb-402d-89ec-5251b11ae7e3-kube-api-access-lbn6l\") pod \"downloads-7954f5f757-bqvzg\" (UID: \"8c484e22-84bb-402d-89ec-5251b11ae7e3\") " pod="openshift-console/downloads-7954f5f757-bqvzg" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djn8x\" (UniqueName: \"kubernetes.io/projected/54355272-2661-4143-acf1-aa5d1c772e5d-kube-api-access-djn8x\") pod \"kube-storage-version-migrator-operator-b67b599dd-vrbm6\" (UID: \"54355272-2661-4143-acf1-aa5d1c772e5d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05989857-37b3-4ca7-960b-9f610bd6cd2c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dssjn\" (UID: \"05989857-37b3-4ca7-960b-9f610bd6cd2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ac01e547-3f74-475d-b3f7-6558207aa984-signing-cabundle\") pod \"service-ca-9c57cc56f-xmd69\" (UID: \"ac01e547-3f74-475d-b3f7-6558207aa984\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763438 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts78r\" (UniqueName: \"kubernetes.io/projected/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-kube-api-access-ts78r\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763472 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597bba13-bc92-4963-a8df-6d32d47d3864-config\") pod \"kube-apiserver-operator-766d6c64bb-dhm9q\" (UID: \"597bba13-bc92-4963-a8df-6d32d47d3864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763489 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05334446-f6c6-4982-9232-43c588eab91f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9s254\" (UID: \"05334446-f6c6-4982-9232-43c588eab91f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763505 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f5954f37-be0d-4076-ae21-ba0039aeb052-srv-cert\") pod \"catalog-operator-68c6474976-xfqrl\" (UID: \"f5954f37-be0d-4076-ae21-ba0039aeb052\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:48 crc 
kubenswrapper[4795]: I0310 15:09:48.763522 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/01f41b58-e54b-4f47-bd01-a11f9078087c-default-certificate\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763537 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7bvd\" (UniqueName: \"kubernetes.io/projected/ac01e547-3f74-475d-b3f7-6558207aa984-kube-api-access-v7bvd\") pod \"service-ca-9c57cc56f-xmd69\" (UID: \"ac01e547-3f74-475d-b3f7-6558207aa984\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763563 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6c64\" (UniqueName: \"kubernetes.io/projected/afe203dd-0a5b-44c3-afd1-fe0452f276bc-kube-api-access-b6c64\") pod \"control-plane-machine-set-operator-78cbb6b69f-sd26k\" (UID: \"afe203dd-0a5b-44c3-afd1-fe0452f276bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763579 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv5cw\" (UniqueName: \"kubernetes.io/projected/f45a1651-aa0f-44ad-9f76-4bcb22348e90-kube-api-access-mv5cw\") pod \"olm-operator-6b444d44fb-dlcdg\" (UID: \"f45a1651-aa0f-44ad-9f76-4bcb22348e90\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-config\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763613 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs6d6\" (UniqueName: \"kubernetes.io/projected/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-kube-api-access-cs6d6\") pod \"marketplace-operator-79b997595-sqw59\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763630 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db723c0e-ac66-4ba7-a4f2-8ba208979d12-config\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763652 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763667 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db723c0e-ac66-4ba7-a4f2-8ba208979d12-serving-cert\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763682 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/db723c0e-ac66-4ba7-a4f2-8ba208979d12-etcd-ca\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763698 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-client-ca\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9zzx\" (UniqueName: \"kubernetes.io/projected/c039cb4c-5391-4e99-828a-884abc3c3cf2-kube-api-access-q9zzx\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: \"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gclg\" (UniqueName: \"kubernetes.io/projected/676a93ca-0674-4884-991b-89518b118411-kube-api-access-4gclg\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763749 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54355272-2661-4143-acf1-aa5d1c772e5d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vrbm6\" (UID: \"54355272-2661-4143-acf1-aa5d1c772e5d\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrxdn\" (UniqueName: \"kubernetes.io/projected/59e98840-fc56-457c-868a-2716e92a2d54-kube-api-access-wrxdn\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c039cb4c-5391-4e99-828a-884abc3c3cf2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: \"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763797 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-serving-ca\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763814 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2729207c-e9fe-4e76-9c70-81c9780f8d8c-metrics-tls\") pod \"dns-operator-744455d44c-6w8jb\" (UID: \"2729207c-e9fe-4e76-9c70-81c9780f8d8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763829 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/01f41b58-e54b-4f47-bd01-a11f9078087c-metrics-certs\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763846 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-images\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763880 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4dfa6c2b-23af-4e87-9117-932c416ed4d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763930 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/03b6f1dc-43b6-489a-8990-7f4a9a33d535-node-pullsecrets\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763953 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hnn6r\" (UniqueName: \"kubernetes.io/projected/2729207c-e9fe-4e76-9c70-81c9780f8d8c-kube-api-access-hnn6r\") pod \"dns-operator-744455d44c-6w8jb\" (UID: \"2729207c-e9fe-4e76-9c70-81c9780f8d8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-service-ca\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763987 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f35c76bb-0875-44e1-9d13-9583c9dc29df-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.763999 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c039cb4c-5391-4e99-828a-884abc3c3cf2-images\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: \"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764004 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-serving-cert\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " 
pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764026 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764052 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-audit-policies\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764089 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764106 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-policies\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764123 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnpnt\" (UniqueName: \"kubernetes.io/projected/db723c0e-ac66-4ba7-a4f2-8ba208979d12-kube-api-access-hnpnt\") pod 
\"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764144 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-console-config\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764160 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01f41b58-e54b-4f47-bd01-a11f9078087c-service-ca-bundle\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764181 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/721d4d9c-e120-4721-b37f-6f2f5998153b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r4d5t\" (UID: \"721d4d9c-e120-4721-b37f-6f2f5998153b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764200 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597bba13-bc92-4963-a8df-6d32d47d3864-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dhm9q\" (UID: \"597bba13-bc92-4963-a8df-6d32d47d3864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764216 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c039cb4c-5391-4e99-828a-884abc3c3cf2-proxy-tls\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: \"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-client\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ca0abdf-0ae2-46bd-9b82-de007e620a36-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764261 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/721d4d9c-e120-4721-b37f-6f2f5998153b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r4d5t\" (UID: \"721d4d9c-e120-4721-b37f-6f2f5998153b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/597bba13-bc92-4963-a8df-6d32d47d3864-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dhm9q\" (UID: 
\"597bba13-bc92-4963-a8df-6d32d47d3864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764293 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676a93ca-0674-4884-991b-89518b118411-serving-cert\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764308 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-config\") pod \"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764323 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98840-fc56-457c-868a-2716e92a2d54-metrics-tls\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764338 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4dfa6c2b-23af-4e87-9117-932c416ed4d2-webhook-cert\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764355 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6ca0abdf-0ae2-46bd-9b82-de007e620a36-service-ca-bundle\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764370 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncbmd\" (UniqueName: \"kubernetes.io/projected/01f41b58-e54b-4f47-bd01-a11f9078087c-kube-api-access-ncbmd\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764386 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad6f1773-b828-4de8-8d5f-0927c9853100-config\") pod \"kube-controller-manager-operator-78b949d7b-2k5kv\" (UID: \"ad6f1773-b828-4de8-8d5f-0927c9853100\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764403 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ca0abdf-0ae2-46bd-9b82-de007e620a36-serving-cert\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764419 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sqw59\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764436 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-serving-cert\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb8hs\" (UniqueName: \"kubernetes.io/projected/457e29f1-69c5-4524-a7e0-78f4944ca94d-kube-api-access-gb8hs\") pod \"migrator-59844c95c7-kxzzq\" (UID: \"457e29f1-69c5-4524-a7e0-78f4944ca94d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764473 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54355272-2661-4143-acf1-aa5d1c772e5d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vrbm6\" (UID: \"54355272-2661-4143-acf1-aa5d1c772e5d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjp89\" (UniqueName: \"kubernetes.io/projected/2d64a005-4a73-4a13-886f-0b072b495bcf-kube-api-access-tjp89\") pod \"multus-admission-controller-857f4d67dd-gnrqn\" (UID: \"2d64a005-4a73-4a13-886f-0b072b495bcf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-encryption-config\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764528 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd6dn\" (UniqueName: \"kubernetes.io/projected/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-kube-api-access-hd6dn\") pod \"console-operator-58897d9998-99x29\" (UID: \"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764543 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-trusted-ca-bundle\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764561 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad6f1773-b828-4de8-8d5f-0927c9853100-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2k5kv\" (UID: \"ad6f1773-b828-4de8-8d5f-0927c9853100\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764578 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdfvt\" (UniqueName: \"kubernetes.io/projected/264d5645-f3d4-4ad1-b7ad-01ef534d4a20-kube-api-access-qdfvt\") pod \"cluster-samples-operator-665b6dd947-cpg78\" (UID: \"264d5645-f3d4-4ad1-b7ad-01ef534d4a20\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764613 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-config\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764629 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe75b643-5c6f-461c-b1aa-4d73f89dad97-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-47qdt\" (UID: \"fe75b643-5c6f-461c-b1aa-4d73f89dad97\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764646 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46936998-cf8b-4370-b4e6-e5a8759c895e-serving-cert\") pod \"openshift-config-operator-7777fb866f-nhjkm\" (UID: \"46936998-cf8b-4370-b4e6-e5a8759c895e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764662 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/db723c0e-ac66-4ba7-a4f2-8ba208979d12-etcd-service-ca\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764665 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-trusted-ca\") pod \"console-operator-58897d9998-99x29\" (UID: \"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2d64a005-4a73-4a13-886f-0b072b495bcf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gnrqn\" (UID: \"2d64a005-4a73-4a13-886f-0b072b495bcf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764731 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe75b643-5c6f-461c-b1aa-4d73f89dad97-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-47qdt\" (UID: \"fe75b643-5c6f-461c-b1aa-4d73f89dad97\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764757 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/721d4d9c-e120-4721-b37f-6f2f5998153b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r4d5t\" (UID: \"721d4d9c-e120-4721-b37f-6f2f5998153b\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764785 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/46936998-cf8b-4370-b4e6-e5a8759c895e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nhjkm\" (UID: \"46936998-cf8b-4370-b4e6-e5a8759c895e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764808 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-auth-proxy-config\") pod \"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764832 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxwv\" (UniqueName: \"kubernetes.io/projected/f5954f37-be0d-4076-ae21-ba0039aeb052-kube-api-access-8kxwv\") pod \"catalog-operator-68c6474976-xfqrl\" (UID: \"f5954f37-be0d-4076-ae21-ba0039aeb052\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764850 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db723c0e-ac66-4ba7-a4f2-8ba208979d12-config\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764872 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/3f0f577c-af8a-414f-ad38-1d1d839d472f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764896 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764920 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f5954f37-be0d-4076-ae21-ba0039aeb052-profile-collector-cert\") pod \"catalog-operator-68c6474976-xfqrl\" (UID: \"f5954f37-be0d-4076-ae21-ba0039aeb052\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764955 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpbt8\" (UniqueName: \"kubernetes.io/projected/05334446-f6c6-4982-9232-43c588eab91f-kube-api-access-hpbt8\") pod \"openshift-apiserver-operator-796bbdcf4f-9s254\" (UID: \"05334446-f6c6-4982-9232-43c588eab91f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.764978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-image-import-ca\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " 
pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f45a1651-aa0f-44ad-9f76-4bcb22348e90-srv-cert\") pod \"olm-operator-6b444d44fb-dlcdg\" (UID: \"f45a1651-aa0f-44ad-9f76-4bcb22348e90\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sqw59\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765049 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4dfa6c2b-23af-4e87-9117-932c416ed4d2-tmpfs\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765088 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e98840-fc56-457c-868a-2716e92a2d54-trusted-ca\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/f35c76bb-0875-44e1-9d13-9583c9dc29df-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765087 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-config\") pod \"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765259 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765282 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g452c"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765561 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-config\") pod \"console-operator-58897d9998-99x29\" (UID: \"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765584 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: 
\"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ca0abdf-0ae2-46bd-9b82-de007e620a36-config\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765728 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2207f46f-83de-4190-a784-533331d951e7-config\") pod \"service-ca-operator-777779d784-zl4pz\" (UID: \"2207f46f-83de-4190-a784-533331d951e7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765747 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-oauth-serving-cert\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765990 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/db723c0e-ac66-4ba7-a4f2-8ba208979d12-etcd-ca\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766131 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e98840-fc56-457c-868a-2716e92a2d54-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766181 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05334446-f6c6-4982-9232-43c588eab91f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9s254\" (UID: \"05334446-f6c6-4982-9232-43c588eab91f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766199 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-client\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766216 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56lgv\" (UniqueName: \"kubernetes.io/projected/03b6f1dc-43b6-489a-8990-7f4a9a33d535-kube-api-access-56lgv\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766253 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-oauth-config\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-dir\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766289 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad6f1773-b828-4de8-8d5f-0927c9853100-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2k5kv\" (UID: \"ad6f1773-b828-4de8-8d5f-0927c9853100\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705e450-7ac4-4741-bab8-e17cb6a79050-serving-cert\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766353 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-config\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766379 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7v9s\" (UniqueName: \"kubernetes.io/projected/4705e450-7ac4-4741-bab8-e17cb6a79050-kube-api-access-p7v9s\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766400 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nkkm\" (UniqueName: \"kubernetes.io/projected/05989857-37b3-4ca7-960b-9f610bd6cd2c-kube-api-access-7nkkm\") pod \"machine-config-controller-84d6567774-dssjn\" (UID: \"05989857-37b3-4ca7-960b-9f610bd6cd2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-machine-approver-tls\") pod \"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ac01e547-3f74-475d-b3f7-6558207aa984-signing-key\") pod \"service-ca-9c57cc56f-xmd69\" (UID: \"ac01e547-3f74-475d-b3f7-6558207aa984\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/db723c0e-ac66-4ba7-a4f2-8ba208979d12-etcd-client\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e13b22c6-cfc8-4709-9610-040c0e80b2c4-audit-dir\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766714 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766841 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/46936998-cf8b-4370-b4e6-e5a8759c895e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nhjkm\" (UID: \"46936998-cf8b-4370-b4e6-e5a8759c895e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.766957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-config\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.767650 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-image-import-ca\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.768291 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/264d5645-f3d4-4ad1-b7ad-01ef534d4a20-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-cpg78\" (UID: \"264d5645-f3d4-4ad1-b7ad-01ef534d4a20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.769020 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6q2qw"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.769049 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.769059 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cp7r2"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.769867 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/05334446-f6c6-4982-9232-43c588eab91f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9s254\" (UID: \"05334446-f6c6-4982-9232-43c588eab91f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.769902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.770099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676a93ca-0674-4884-991b-89518b118411-serving-cert\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.765585 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ca0abdf-0ae2-46bd-9b82-de007e620a36-service-ca-bundle\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.770313 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-dir\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.770404 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit-dir\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.770427 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05989857-37b3-4ca7-960b-9f610bd6cd2c-proxy-tls\") pod \"machine-config-controller-84d6567774-dssjn\" (UID: \"05989857-37b3-4ca7-960b-9f610bd6cd2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.770463 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-client-ca\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.770484 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-serving-cert\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.770624 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-policies\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.770663 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-zdmvt"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.770769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db723c0e-ac66-4ba7-a4f2-8ba208979d12-serving-cert\") pod 
\"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.770970 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-audit-policies\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.771175 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/db723c0e-ac66-4ba7-a4f2-8ba208979d12-etcd-service-ca\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.771266 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit-dir\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.771321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.771349 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-auth-proxy-config\") pod 
\"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.771379 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.771407 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-config\") pod \"console-operator-58897d9998-99x29\" (UID: \"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.771474 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.771730 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ca0abdf-0ae2-46bd-9b82-de007e620a36-config\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.772136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.772200 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/03b6f1dc-43b6-489a-8990-7f4a9a33d535-node-pullsecrets\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.772410 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54355272-2661-4143-acf1-aa5d1c772e5d-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-vrbm6\" (UID: \"54355272-2661-4143-acf1-aa5d1c772e5d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.772453 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.772724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.773003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/c039cb4c-5391-4e99-828a-884abc3c3cf2-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: \"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.773202 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-client-ca\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.773594 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8v6fj"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.773650 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6ca0abdf-0ae2-46bd-9b82-de007e620a36-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.774088 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.774499 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-proxy-ca-bundles\") pod 
\"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.774571 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8v6fj" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.774998 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.775501 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.775790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2729207c-e9fe-4e76-9c70-81c9780f8d8c-metrics-tls\") pod \"dns-operator-744455d44c-6w8jb\" (UID: \"2729207c-e9fe-4e76-9c70-81c9780f8d8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.776255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.776612 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-65bbp"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.777928 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.778294 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46936998-cf8b-4370-b4e6-e5a8759c895e-serving-cert\") pod \"openshift-config-operator-7777fb866f-nhjkm\" (UID: \"46936998-cf8b-4370-b4e6-e5a8759c895e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.778988 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.781212 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ca0abdf-0ae2-46bd-9b82-de007e620a36-serving-cert\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.781254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-machine-approver-tls\") pod \"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.781771 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-serving-cert\") pod \"console-operator-58897d9998-99x29\" (UID: \"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.782296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/54355272-2661-4143-acf1-aa5d1c772e5d-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-vrbm6\" (UID: \"54355272-2661-4143-acf1-aa5d1c772e5d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.782355 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8v6fj"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.782372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.782510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.783364 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.783542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/db723c0e-ac66-4ba7-a4f2-8ba208979d12-etcd-client\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.784051 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c039cb4c-5391-4e99-828a-884abc3c3cf2-proxy-tls\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: 
\"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.784400 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/05334446-f6c6-4982-9232-43c588eab91f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9s254\" (UID: \"05334446-f6c6-4982-9232-43c588eab91f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.784669 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.788538 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.789612 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vn4gm"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.790579 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h8mc4"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.791782 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-hj256"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.792294 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hj256" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.791930 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-h8mc4" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.795373 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h8mc4"] Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.804513 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.815405 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.835615 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.855963 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871362 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-client-ca\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871410 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrxdn\" (UniqueName: \"kubernetes.io/projected/59e98840-fc56-457c-868a-2716e92a2d54-kube-api-access-wrxdn\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871447 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01f41b58-e54b-4f47-bd01-a11f9078087c-metrics-certs\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871479 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4dfa6c2b-23af-4e87-9117-932c416ed4d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-service-ca\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871535 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f35c76bb-0875-44e1-9d13-9583c9dc29df-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-serving-cert\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: 
I0310 15:09:48.871595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-console-config\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871620 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01f41b58-e54b-4f47-bd01-a11f9078087c-service-ca-bundle\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597bba13-bc92-4963-a8df-6d32d47d3864-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dhm9q\" (UID: \"597bba13-bc92-4963-a8df-6d32d47d3864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/721d4d9c-e120-4721-b37f-6f2f5998153b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r4d5t\" (UID: \"721d4d9c-e120-4721-b37f-6f2f5998153b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/721d4d9c-e120-4721-b37f-6f2f5998153b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r4d5t\" (UID: 
\"721d4d9c-e120-4721-b37f-6f2f5998153b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871718 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/597bba13-bc92-4963-a8df-6d32d47d3864-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dhm9q\" (UID: \"597bba13-bc92-4963-a8df-6d32d47d3864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871746 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98840-fc56-457c-868a-2716e92a2d54-metrics-tls\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871769 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4dfa6c2b-23af-4e87-9117-932c416ed4d2-webhook-cert\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncbmd\" (UniqueName: \"kubernetes.io/projected/01f41b58-e54b-4f47-bd01-a11f9078087c-kube-api-access-ncbmd\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871819 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ad6f1773-b828-4de8-8d5f-0927c9853100-config\") pod \"kube-controller-manager-operator-78b949d7b-2k5kv\" (UID: \"ad6f1773-b828-4de8-8d5f-0927c9853100\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871844 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sqw59\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871880 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb8hs\" (UniqueName: \"kubernetes.io/projected/457e29f1-69c5-4524-a7e0-78f4944ca94d-kube-api-access-gb8hs\") pod \"migrator-59844c95c7-kxzzq\" (UID: \"457e29f1-69c5-4524-a7e0-78f4944ca94d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871912 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjp89\" (UniqueName: \"kubernetes.io/projected/2d64a005-4a73-4a13-886f-0b072b495bcf-kube-api-access-tjp89\") pod \"multus-admission-controller-857f4d67dd-gnrqn\" (UID: \"2d64a005-4a73-4a13-886f-0b072b495bcf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.871951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-trusted-ca-bundle\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 
crc kubenswrapper[4795]: I0310 15:09:48.871978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad6f1773-b828-4de8-8d5f-0927c9853100-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2k5kv\" (UID: \"ad6f1773-b828-4de8-8d5f-0927c9853100\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872013 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-config\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe75b643-5c6f-461c-b1aa-4d73f89dad97-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-47qdt\" (UID: \"fe75b643-5c6f-461c-b1aa-4d73f89dad97\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872060 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2d64a005-4a73-4a13-886f-0b072b495bcf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gnrqn\" (UID: \"2d64a005-4a73-4a13-886f-0b072b495bcf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872105 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe75b643-5c6f-461c-b1aa-4d73f89dad97-serving-cert\") 
pod \"openshift-controller-manager-operator-756b6f6bc6-47qdt\" (UID: \"fe75b643-5c6f-461c-b1aa-4d73f89dad97\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/721d4d9c-e120-4721-b37f-6f2f5998153b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r4d5t\" (UID: \"721d4d9c-e120-4721-b37f-6f2f5998153b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872156 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxwv\" (UniqueName: \"kubernetes.io/projected/f5954f37-be0d-4076-ae21-ba0039aeb052-kube-api-access-8kxwv\") pod \"catalog-operator-68c6474976-xfqrl\" (UID: \"f5954f37-be0d-4076-ae21-ba0039aeb052\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872188 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f5954f37-be0d-4076-ae21-ba0039aeb052-profile-collector-cert\") pod \"catalog-operator-68c6474976-xfqrl\" (UID: \"f5954f37-be0d-4076-ae21-ba0039aeb052\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872232 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f45a1651-aa0f-44ad-9f76-4bcb22348e90-srv-cert\") pod \"olm-operator-6b444d44fb-dlcdg\" (UID: \"f45a1651-aa0f-44ad-9f76-4bcb22348e90\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872267 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sqw59\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4dfa6c2b-23af-4e87-9117-932c416ed4d2-tmpfs\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e98840-fc56-457c-868a-2716e92a2d54-trusted-ca\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872339 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f35c76bb-0875-44e1-9d13-9583c9dc29df-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872368 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2207f46f-83de-4190-a784-533331d951e7-config\") pod \"service-ca-operator-777779d784-zl4pz\" (UID: \"2207f46f-83de-4190-a784-533331d951e7\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872391 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-oauth-serving-cert\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872416 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e98840-fc56-457c-868a-2716e92a2d54-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872478 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-console-config\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872454 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-oauth-config\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872565 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-service-ca\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " 
pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad6f1773-b828-4de8-8d5f-0927c9853100-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2k5kv\" (UID: \"ad6f1773-b828-4de8-8d5f-0927c9853100\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872628 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705e450-7ac4-4741-bab8-e17cb6a79050-serving-cert\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872666 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7v9s\" (UniqueName: \"kubernetes.io/projected/4705e450-7ac4-4741-bab8-e17cb6a79050-kube-api-access-p7v9s\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872695 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nkkm\" (UniqueName: \"kubernetes.io/projected/05989857-37b3-4ca7-960b-9f610bd6cd2c-kube-api-access-7nkkm\") pod \"machine-config-controller-84d6567774-dssjn\" (UID: \"05989857-37b3-4ca7-960b-9f610bd6cd2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"signing-key\" (UniqueName: \"kubernetes.io/secret/ac01e547-3f74-475d-b3f7-6558207aa984-signing-key\") pod \"service-ca-9c57cc56f-xmd69\" (UID: \"ac01e547-3f74-475d-b3f7-6558207aa984\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872758 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05989857-37b3-4ca7-960b-9f610bd6cd2c-proxy-tls\") pod \"machine-config-controller-84d6567774-dssjn\" (UID: \"05989857-37b3-4ca7-960b-9f610bd6cd2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4m98\" (UniqueName: \"kubernetes.io/projected/a8e49156-de09-480a-933b-6815cde0b311-kube-api-access-b4m98\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872827 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2207f46f-83de-4190-a784-533331d951e7-serving-cert\") pod \"service-ca-operator-777779d784-zl4pz\" (UID: \"2207f46f-83de-4190-a784-533331d951e7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872850 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f45a1651-aa0f-44ad-9f76-4bcb22348e90-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dlcdg\" (UID: \"f45a1651-aa0f-44ad-9f76-4bcb22348e90\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872883 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzzkp\" (UniqueName: \"kubernetes.io/projected/4dfa6c2b-23af-4e87-9117-932c416ed4d2-kube-api-access-zzzkp\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872913 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/01f41b58-e54b-4f47-bd01-a11f9078087c-stats-auth\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872928 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f35c76bb-0875-44e1-9d13-9583c9dc29df-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872928 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4dfa6c2b-23af-4e87-9117-932c416ed4d2-tmpfs\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.872946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe203dd-0a5b-44c3-afd1-fe0452f276bc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sd26k\" (UID: \"afe203dd-0a5b-44c3-afd1-fe0452f276bc\") 
" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873008 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbbg7\" (UniqueName: \"kubernetes.io/projected/fe75b643-5c6f-461c-b1aa-4d73f89dad97-kube-api-access-lbbg7\") pod \"openshift-controller-manager-operator-756b6f6bc6-47qdt\" (UID: \"fe75b643-5c6f-461c-b1aa-4d73f89dad97\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873054 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hgbv\" (UniqueName: \"kubernetes.io/projected/2207f46f-83de-4190-a784-533331d951e7-kube-api-access-5hgbv\") pod \"service-ca-operator-777779d784-zl4pz\" (UID: \"2207f46f-83de-4190-a784-533331d951e7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873107 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qct7h\" (UniqueName: \"kubernetes.io/projected/f35c76bb-0875-44e1-9d13-9583c9dc29df-kube-api-access-qct7h\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873120 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-oauth-serving-cert\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873165 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05989857-37b3-4ca7-960b-9f610bd6cd2c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dssjn\" (UID: \"05989857-37b3-4ca7-960b-9f610bd6cd2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873199 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ac01e547-3f74-475d-b3f7-6558207aa984-signing-cabundle\") pod \"service-ca-9c57cc56f-xmd69\" (UID: \"ac01e547-3f74-475d-b3f7-6558207aa984\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597bba13-bc92-4963-a8df-6d32d47d3864-config\") pod \"kube-apiserver-operator-766d6c64bb-dhm9q\" (UID: \"597bba13-bc92-4963-a8df-6d32d47d3864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f5954f37-be0d-4076-ae21-ba0039aeb052-srv-cert\") pod \"catalog-operator-68c6474976-xfqrl\" (UID: \"f5954f37-be0d-4076-ae21-ba0039aeb052\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873296 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/01f41b58-e54b-4f47-bd01-a11f9078087c-default-certificate\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 
15:09:48.873323 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7bvd\" (UniqueName: \"kubernetes.io/projected/ac01e547-3f74-475d-b3f7-6558207aa984-kube-api-access-v7bvd\") pod \"service-ca-9c57cc56f-xmd69\" (UID: \"ac01e547-3f74-475d-b3f7-6558207aa984\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6c64\" (UniqueName: \"kubernetes.io/projected/afe203dd-0a5b-44c3-afd1-fe0452f276bc-kube-api-access-b6c64\") pod \"control-plane-machine-set-operator-78cbb6b69f-sd26k\" (UID: \"afe203dd-0a5b-44c3-afd1-fe0452f276bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv5cw\" (UniqueName: \"kubernetes.io/projected/f45a1651-aa0f-44ad-9f76-4bcb22348e90-kube-api-access-mv5cw\") pod \"olm-operator-6b444d44fb-dlcdg\" (UID: \"f45a1651-aa0f-44ad-9f76-4bcb22348e90\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs6d6\" (UniqueName: \"kubernetes.io/projected/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-kube-api-access-cs6d6\") pod \"marketplace-operator-79b997595-sqw59\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.873681 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-trusted-ca-bundle\") pod \"console-f9d7485db-g452c\" (UID: 
\"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.874085 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/05989857-37b3-4ca7-960b-9f610bd6cd2c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-dssjn\" (UID: \"05989857-37b3-4ca7-960b-9f610bd6cd2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.875003 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fe75b643-5c6f-461c-b1aa-4d73f89dad97-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-47qdt\" (UID: \"fe75b643-5c6f-461c-b1aa-4d73f89dad97\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.875811 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-oauth-config\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.876028 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.877131 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-serving-cert\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 
15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.883092 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe75b643-5c6f-461c-b1aa-4d73f89dad97-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-47qdt\" (UID: \"fe75b643-5c6f-461c-b1aa-4d73f89dad97\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.895721 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.916624 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.926301 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sqw59\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.932212 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.942967 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.953815 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sqw59\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.957688 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.975576 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 15:09:48 crc kubenswrapper[4795]: I0310 15:09:48.998647 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.009644 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/f35c76bb-0875-44e1-9d13-9583c9dc29df-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.015754 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.040243 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.042608 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f35c76bb-0875-44e1-9d13-9583c9dc29df-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.055395 4795 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.075834 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.096380 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.104112 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597bba13-bc92-4963-a8df-6d32d47d3864-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-dhm9q\" (UID: \"597bba13-bc92-4963-a8df-6d32d47d3864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.115800 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.123838 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597bba13-bc92-4963-a8df-6d32d47d3864-config\") pod \"kube-apiserver-operator-766d6c64bb-dhm9q\" (UID: \"597bba13-bc92-4963-a8df-6d32d47d3864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.136554 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.175753 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.196664 4795 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.206672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05989857-37b3-4ca7-960b-9f610bd6cd2c-proxy-tls\") pod \"machine-config-controller-84d6567774-dssjn\" (UID: \"05989857-37b3-4ca7-960b-9f610bd6cd2c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.216565 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.236907 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.247000 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/2d64a005-4a73-4a13-886f-0b072b495bcf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-gnrqn\" (UID: \"2d64a005-4a73-4a13-886f-0b072b495bcf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.256620 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.275700 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.296806 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.316394 4795 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.335868 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.356323 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.365023 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/59e98840-fc56-457c-868a-2716e92a2d54-metrics-tls\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.376463 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.386208 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f45a1651-aa0f-44ad-9f76-4bcb22348e90-srv-cert\") pod \"olm-operator-6b444d44fb-dlcdg\" (UID: \"f45a1651-aa0f-44ad-9f76-4bcb22348e90\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.401409 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.404644 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/59e98840-fc56-457c-868a-2716e92a2d54-trusted-ca\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.416380 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.426191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f5954f37-be0d-4076-ae21-ba0039aeb052-profile-collector-cert\") pod \"catalog-operator-68c6474976-xfqrl\" (UID: \"f5954f37-be0d-4076-ae21-ba0039aeb052\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.426281 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f45a1651-aa0f-44ad-9f76-4bcb22348e90-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dlcdg\" (UID: \"f45a1651-aa0f-44ad-9f76-4bcb22348e90\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.435896 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.455540 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.465015 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4dfa6c2b-23af-4e87-9117-932c416ed4d2-webhook-cert\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.465206 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4dfa6c2b-23af-4e87-9117-932c416ed4d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.476263 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.496211 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.515804 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.526398 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad6f1773-b828-4de8-8d5f-0927c9853100-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-2k5kv\" (UID: \"ad6f1773-b828-4de8-8d5f-0927c9853100\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.536001 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.546607 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f5954f37-be0d-4076-ae21-ba0039aeb052-srv-cert\") pod \"catalog-operator-68c6474976-xfqrl\" (UID: \"f5954f37-be0d-4076-ae21-ba0039aeb052\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 
15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.556078 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.563304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad6f1773-b828-4de8-8d5f-0927c9853100-config\") pod \"kube-controller-manager-operator-78b949d7b-2k5kv\" (UID: \"ad6f1773-b828-4de8-8d5f-0927c9853100\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.575603 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.595796 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.615902 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.635720 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.648382 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/01f41b58-e54b-4f47-bd01-a11f9078087c-default-certificate\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.656779 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.667305 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/01f41b58-e54b-4f47-bd01-a11f9078087c-stats-auth\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.676559 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.687397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01f41b58-e54b-4f47-bd01-a11f9078087c-metrics-certs\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.695173 4795 request.go:700] Waited for 1.008665139s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/configmaps?fieldSelector=metadata.name%3Dservice-ca-bundle&limit=500&resourceVersion=0 Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.696926 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.702993 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01f41b58-e54b-4f47-bd01-a11f9078087c-service-ca-bundle\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.716762 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 10 15:09:49 crc 
kubenswrapper[4795]: I0310 15:09:49.736587 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.757668 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.763913 4795 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.763999 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-trusted-ca-bundle podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.263979214 +0000 UTC m=+223.429720122 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-trusted-ca-bundle") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.764194 4795 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.764330 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-config podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.264297083 +0000 UTC m=+223.430038011 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-config") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.765799 4795 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.765852 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-encryption-config podName:e13b22c6-cfc8-4709-9610-040c0e80b2c4 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.265838966 +0000 UTC m=+223.431579874 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-encryption-config") pod "apiserver-7bbb656c7d-6slmv" (UID: "e13b22c6-cfc8-4709-9610-040c0e80b2c4") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.767506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ac01e547-3f74-475d-b3f7-6558207aa984-signing-key\") pod \"service-ca-9c57cc56f-xmd69\" (UID: \"ac01e547-3f74-475d-b3f7-6558207aa984\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.767894 4795 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.767940 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit 
podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.267927025 +0000 UTC m=+223.433667933 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.770800 4795 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.770811 4795 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.770877 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-serving-cert podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.270856518 +0000 UTC m=+223.436597496 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-serving-cert") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.770999 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-trusted-ca-bundle podName:e13b22c6-cfc8-4709-9610-040c0e80b2c4 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.270890238 +0000 UTC m=+223.436631166 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-trusted-ca-bundle") pod "apiserver-7bbb656c7d-6slmv" (UID: "e13b22c6-cfc8-4709-9610-040c0e80b2c4") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.771334 4795 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.771406 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0f577c-af8a-414f-ad38-1d1d839d472f-machine-api-operator-tls podName:3f0f577c-af8a-414f-ad38-1d1d839d472f nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.271389093 +0000 UTC m=+223.437130021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/3f0f577c-af8a-414f-ad38-1d1d839d472f-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-7csjs" (UID: "3f0f577c-af8a-414f-ad38-1d1d839d472f") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.771559 4795 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.771629 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-config podName:3f0f577c-af8a-414f-ad38-1d1d839d472f nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.271609619 +0000 UTC m=+223.437350607 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-config") pod "machine-api-operator-5694c8668f-7csjs" (UID: "3f0f577c-af8a-414f-ad38-1d1d839d472f") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.771711 4795 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.771818 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-client podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.271797314 +0000 UTC m=+223.437538272 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-client") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.771908 4795 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.771964 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-images podName:3f0f577c-af8a-414f-ad38-1d1d839d472f nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.271948718 +0000 UTC m=+223.437689726 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-images") pod "machine-api-operator-5694c8668f-7csjs" (UID: "3f0f577c-af8a-414f-ad38-1d1d839d472f") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.771991 4795 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.772027 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-serving-cert podName:e13b22c6-cfc8-4709-9610-040c0e80b2c4 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.27201391 +0000 UTC m=+223.437754928 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-serving-cert") pod "apiserver-7bbb656c7d-6slmv" (UID: "e13b22c6-cfc8-4709-9610-040c0e80b2c4") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.772431 4795 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.772533 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-encryption-config podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.272491204 +0000 UTC m=+223.438232112 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-encryption-config") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.773117 4795 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.773170 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-serving-ca podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.273156062 +0000 UTC m=+223.438896970 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-serving-ca") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.773964 4795 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.774024 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-client podName:e13b22c6-cfc8-4709-9610-040c0e80b2c4 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.274009146 +0000 UTC m=+223.439750144 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-client") pod "apiserver-7bbb656c7d-6slmv" (UID: "e13b22c6-cfc8-4709-9610-040c0e80b2c4") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.775464 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.784402 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ac01e547-3f74-475d-b3f7-6558207aa984-signing-cabundle\") pod \"service-ca-9c57cc56f-xmd69\" (UID: \"ac01e547-3f74-475d-b3f7-6558207aa984\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.795973 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.816484 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.835598 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.857240 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.871896 4795 secret.go:188] Couldn't get secret openshift-kube-scheduler-operator/kube-scheduler-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.871997 4795 configmap.go:193] Couldn't get configMap 
openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.872122 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/721d4d9c-e120-4721-b37f-6f2f5998153b-serving-cert podName:721d4d9c-e120-4721-b37f-6f2f5998153b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.372026086 +0000 UTC m=+223.537767024 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/721d4d9c-e120-4721-b37f-6f2f5998153b-serving-cert") pod "openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" (UID: "721d4d9c-e120-4721-b37f-6f2f5998153b") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.872192 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-client-ca podName:4705e450-7ac4-4741-bab8-e17cb6a79050 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.37217656 +0000 UTC m=+223.537917498 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-client-ca") pod "route-controller-manager-6576b87f9c-45xss" (UID: "4705e450-7ac4-4741-bab8-e17cb6a79050") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.872303 4795 configmap.go:193] Couldn't get configMap openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.872333 4795 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.872358 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/721d4d9c-e120-4721-b37f-6f2f5998153b-config podName:721d4d9c-e120-4721-b37f-6f2f5998153b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.372342695 +0000 UTC m=+223.538083623 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/721d4d9c-e120-4721-b37f-6f2f5998153b-config") pod "openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" (UID: "721d4d9c-e120-4721-b37f-6f2f5998153b") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.872470 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-config podName:4705e450-7ac4-4741-bab8-e17cb6a79050 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.372445278 +0000 UTC m=+223.538186196 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-config") pod "route-controller-manager-6576b87f9c-45xss" (UID: "4705e450-7ac4-4741-bab8-e17cb6a79050") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.872483 4795 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.872640 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/2207f46f-83de-4190-a784-533331d951e7-config podName:2207f46f-83de-4190-a784-533331d951e7 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.372551191 +0000 UTC m=+223.538292139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/2207f46f-83de-4190-a784-533331d951e7-config") pod "service-ca-operator-777779d784-zl4pz" (UID: "2207f46f-83de-4190-a784-533331d951e7") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.873088 4795 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.873141 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afe203dd-0a5b-44c3-afd1-fe0452f276bc-control-plane-machine-set-operator-tls podName:afe203dd-0a5b-44c3-afd1-fe0452f276bc nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.373127427 +0000 UTC m=+223.538868335 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/afe203dd-0a5b-44c3-afd1-fe0452f276bc-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-sd26k" (UID: "afe203dd-0a5b-44c3-afd1-fe0452f276bc") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.873155 4795 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.873164 4795 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.873256 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4705e450-7ac4-4741-bab8-e17cb6a79050-serving-cert podName:4705e450-7ac4-4741-bab8-e17cb6a79050 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.37322348 +0000 UTC m=+223.538964428 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/4705e450-7ac4-4741-bab8-e17cb6a79050-serving-cert") pod "route-controller-manager-6576b87f9c-45xss" (UID: "4705e450-7ac4-4741-bab8-e17cb6a79050") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: E0310 15:09:49.873284 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2207f46f-83de-4190-a784-533331d951e7-serving-cert podName:2207f46f-83de-4190-a784-533331d951e7 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:50.373274381 +0000 UTC m=+223.539015289 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/2207f46f-83de-4190-a784-533331d951e7-serving-cert") pod "service-ca-operator-777779d784-zl4pz" (UID: "2207f46f-83de-4190-a784-533331d951e7") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.875669 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.896096 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.915813 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.936194 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.956706 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.977008 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:09:49 crc kubenswrapper[4795]: I0310 15:09:49.997597 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.016754 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.036190 4795 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.056452 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.075871 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.095884 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.116039 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.135487 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.156986 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.176020 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.197450 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.216041 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.238004 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:09:50 crc 
kubenswrapper[4795]: I0310 15:09:50.256409 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.276257 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.297194 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.298267 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-serving-cert\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.298716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.298845 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-encryption-config\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.298921 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit\") pod 
\"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.299028 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-config\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.299162 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-serving-ca\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.299218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-images\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.299269 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.299326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-client\") pod 
\"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.299407 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-serving-cert\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.299470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-encryption-config\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.299592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f0f577c-af8a-414f-ad38-1d1d839d472f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.299699 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-client\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.299765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-config\") pod 
\"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.317674 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.336619 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.356503 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.376104 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.396130 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.401256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-client-ca\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.401469 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/721d4d9c-e120-4721-b37f-6f2f5998153b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r4d5t\" (UID: \"721d4d9c-e120-4721-b37f-6f2f5998153b\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.401674 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-config\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.401792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/721d4d9c-e120-4721-b37f-6f2f5998153b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r4d5t\" (UID: \"721d4d9c-e120-4721-b37f-6f2f5998153b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.401953 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2207f46f-83de-4190-a784-533331d951e7-config\") pod \"service-ca-operator-777779d784-zl4pz\" (UID: \"2207f46f-83de-4190-a784-533331d951e7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.402107 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705e450-7ac4-4741-bab8-e17cb6a79050-serving-cert\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.402282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2207f46f-83de-4190-a784-533331d951e7-serving-cert\") pod \"service-ca-operator-777779d784-zl4pz\" (UID: \"2207f46f-83de-4190-a784-533331d951e7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.402421 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe203dd-0a5b-44c3-afd1-fe0452f276bc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sd26k\" (UID: \"afe203dd-0a5b-44c3-afd1-fe0452f276bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.402641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/721d4d9c-e120-4721-b37f-6f2f5998153b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r4d5t\" (UID: \"721d4d9c-e120-4721-b37f-6f2f5998153b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.403451 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-client-ca\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.403495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2207f46f-83de-4190-a784-533331d951e7-config\") pod \"service-ca-operator-777779d784-zl4pz\" (UID: \"2207f46f-83de-4190-a784-533331d951e7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 
15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.404764 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-config\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.410270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/721d4d9c-e120-4721-b37f-6f2f5998153b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r4d5t\" (UID: \"721d4d9c-e120-4721-b37f-6f2f5998153b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.410667 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705e450-7ac4-4741-bab8-e17cb6a79050-serving-cert\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.411276 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/afe203dd-0a5b-44c3-afd1-fe0452f276bc-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-sd26k\" (UID: \"afe203dd-0a5b-44c3-afd1-fe0452f276bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.413164 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2207f46f-83de-4190-a784-533331d951e7-serving-cert\") pod \"service-ca-operator-777779d784-zl4pz\" (UID: \"2207f46f-83de-4190-a784-533331d951e7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.417032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.436470 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.526251 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qltmm\" (UniqueName: \"kubernetes.io/projected/6ca0abdf-0ae2-46bd-9b82-de007e620a36-kube-api-access-qltmm\") pod \"authentication-operator-69f744f599-w7lpw\" (UID: \"6ca0abdf-0ae2-46bd-9b82-de007e620a36\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.542849 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x549s\" (UniqueName: \"kubernetes.io/projected/46936998-cf8b-4370-b4e6-e5a8759c895e-kube-api-access-x549s\") pod \"openshift-config-operator-7777fb866f-nhjkm\" (UID: \"46936998-cf8b-4370-b4e6-e5a8759c895e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.579842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frwpt\" (UniqueName: \"kubernetes.io/projected/8cec803c-a30b-48aa-b9e7-7b913afcfb4a-kube-api-access-frwpt\") pod \"machine-approver-56656f9798-2slt9\" (UID: \"8cec803c-a30b-48aa-b9e7-7b913afcfb4a\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.594775 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpbt8\" (UniqueName: \"kubernetes.io/projected/05334446-f6c6-4982-9232-43c588eab91f-kube-api-access-hpbt8\") pod \"openshift-apiserver-operator-796bbdcf4f-9s254\" (UID: \"05334446-f6c6-4982-9232-43c588eab91f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.624714 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbn6l\" (UniqueName: \"kubernetes.io/projected/8c484e22-84bb-402d-89ec-5251b11ae7e3-kube-api-access-lbn6l\") pod \"downloads-7954f5f757-bqvzg\" (UID: \"8c484e22-84bb-402d-89ec-5251b11ae7e3\") " pod="openshift-console/downloads-7954f5f757-bqvzg" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.643182 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djn8x\" (UniqueName: \"kubernetes.io/projected/54355272-2661-4143-acf1-aa5d1c772e5d-kube-api-access-djn8x\") pod \"kube-storage-version-migrator-operator-b67b599dd-vrbm6\" (UID: \"54355272-2661-4143-acf1-aa5d1c772e5d\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.655275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts78r\" (UniqueName: \"kubernetes.io/projected/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-kube-api-access-ts78r\") pod \"oauth-openshift-558db77b4-jd6qf\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.676152 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd6dn\" (UniqueName: \"kubernetes.io/projected/874d72ba-5718-48d5-b2a8-dd2cb5c0cc78-kube-api-access-hd6dn\") pod \"console-operator-58897d9998-99x29\" (UID: 
\"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78\") " pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.695231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnpnt\" (UniqueName: \"kubernetes.io/projected/db723c0e-ac66-4ba7-a4f2-8ba208979d12-kube-api-access-hnpnt\") pod \"etcd-operator-b45778765-65bbp\" (UID: \"db723c0e-ac66-4ba7-a4f2-8ba208979d12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.714902 4795 request.go:700] Waited for 1.943442286s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.718937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9zzx\" (UniqueName: \"kubernetes.io/projected/c039cb4c-5391-4e99-828a-884abc3c3cf2-kube-api-access-q9zzx\") pod \"machine-config-operator-74547568cd-c5hxz\" (UID: \"c039cb4c-5391-4e99-828a-884abc3c3cf2\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.750399 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gclg\" (UniqueName: \"kubernetes.io/projected/676a93ca-0674-4884-991b-89518b118411-kube-api-access-4gclg\") pod \"controller-manager-879f6c89f-cg27q\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") " pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.771054 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnn6r\" (UniqueName: \"kubernetes.io/projected/2729207c-e9fe-4e76-9c70-81c9780f8d8c-kube-api-access-hnn6r\") pod \"dns-operator-744455d44c-6w8jb\" (UID: 
\"2729207c-e9fe-4e76-9c70-81c9780f8d8c\") " pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.785619 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.795868 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.797486 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.799038 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdfvt\" (UniqueName: \"kubernetes.io/projected/264d5645-f3d4-4ad1-b7ad-01ef534d4a20-kube-api-access-qdfvt\") pod \"cluster-samples-operator-665b6dd947-cpg78\" (UID: \"264d5645-f3d4-4ad1-b7ad-01ef534d4a20\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.807391 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.816578 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.816885 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.824250 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.835677 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.837016 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.851919 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.855633 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.860183 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.863924 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bqvzg" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.879456 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.896209 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.909287 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.909866 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.916013 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.916578 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.935645 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.956214 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.978990 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:50 crc kubenswrapper[4795]: I0310 15:09:50.999369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/721d4d9c-e120-4721-b37f-6f2f5998153b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r4d5t\" (UID: \"721d4d9c-e120-4721-b37f-6f2f5998153b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.014228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/597bba13-bc92-4963-a8df-6d32d47d3864-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-dhm9q\" (UID: \"597bba13-bc92-4963-a8df-6d32d47d3864\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.033376 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncbmd\" (UniqueName: \"kubernetes.io/projected/01f41b58-e54b-4f47-bd01-a11f9078087c-kube-api-access-ncbmd\") pod \"router-default-5444994796-xg845\" (UID: \"01f41b58-e54b-4f47-bd01-a11f9078087c\") " pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.058945 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.060227 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjp89\" (UniqueName: \"kubernetes.io/projected/2d64a005-4a73-4a13-886f-0b072b495bcf-kube-api-access-tjp89\") pod \"multus-admission-controller-857f4d67dd-gnrqn\" (UID: \"2d64a005-4a73-4a13-886f-0b072b495bcf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.090136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb8hs\" (UniqueName: \"kubernetes.io/projected/457e29f1-69c5-4524-a7e0-78f4944ca94d-kube-api-access-gb8hs\") pod \"migrator-59844c95c7-kxzzq\" (UID: \"457e29f1-69c5-4524-a7e0-78f4944ca94d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.092818 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad6f1773-b828-4de8-8d5f-0927c9853100-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-2k5kv\" (UID: \"ad6f1773-b828-4de8-8d5f-0927c9853100\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.109908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrxdn\" (UniqueName: \"kubernetes.io/projected/59e98840-fc56-457c-868a-2716e92a2d54-kube-api-access-wrxdn\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.119828 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.125835 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-w7lpw"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.133837 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxwv\" (UniqueName: \"kubernetes.io/projected/f5954f37-be0d-4076-ae21-ba0039aeb052-kube-api-access-8kxwv\") pod \"catalog-operator-68c6474976-xfqrl\" (UID: \"f5954f37-be0d-4076-ae21-ba0039aeb052\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.155421 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/59e98840-fc56-457c-868a-2716e92a2d54-bound-sa-token\") pod \"ingress-operator-5b745b69d9-x97l4\" (UID: \"59e98840-fc56-457c-868a-2716e92a2d54\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.170595 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7v9s\" (UniqueName: \"kubernetes.io/projected/4705e450-7ac4-4741-bab8-e17cb6a79050-kube-api-access-p7v9s\") pod \"route-controller-manager-6576b87f9c-45xss\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.194618 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nkkm\" (UniqueName: \"kubernetes.io/projected/05989857-37b3-4ca7-960b-9f610bd6cd2c-kube-api-access-7nkkm\") pod \"machine-config-controller-84d6567774-dssjn\" (UID: \"05989857-37b3-4ca7-960b-9f610bd6cd2c\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.209735 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4m98\" (UniqueName: \"kubernetes.io/projected/a8e49156-de09-480a-933b-6815cde0b311-kube-api-access-b4m98\") pod \"console-f9d7485db-g452c\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.222301 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.236498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzzkp\" (UniqueName: \"kubernetes.io/projected/4dfa6c2b-23af-4e87-9117-932c416ed4d2-kube-api-access-zzzkp\") pod \"packageserver-d55dfcdfc-8kghz\" (UID: \"4dfa6c2b-23af-4e87-9117-932c416ed4d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.243308 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.258428 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.267220 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f35c76bb-0875-44e1-9d13-9583c9dc29df-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.272195 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.278546 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbbg7\" (UniqueName: \"kubernetes.io/projected/fe75b643-5c6f-461c-b1aa-4d73f89dad97-kube-api-access-lbbg7\") pod \"openshift-controller-manager-operator-756b6f6bc6-47qdt\" (UID: \"fe75b643-5c6f-461c-b1aa-4d73f89dad97\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.290876 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.292274 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hgbv\" (UniqueName: \"kubernetes.io/projected/2207f46f-83de-4190-a784-533331d951e7-kube-api-access-5hgbv\") pod \"service-ca-operator-777779d784-zl4pz\" (UID: \"2207f46f-83de-4190-a784-533331d951e7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.297849 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.298690 4795 secret.go:188] Couldn't get secret openshift-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.298758 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-serving-cert podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.298737183 +0000 UTC m=+225.464478081 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-serving-cert") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.298765 4795 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.298826 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-trusted-ca-bundle podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.298810226 +0000 UTC m=+225.464551124 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-trusted-ca-bundle") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299052 4795 secret.go:188] Couldn't get secret openshift-oauth-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299101 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-encryption-config podName:e13b22c6-cfc8-4709-9610-040c0e80b2c4 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.299094174 +0000 UTC m=+225.464835072 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-encryption-config") pod "apiserver-7bbb656c7d-6slmv" (UID: "e13b22c6-cfc8-4709-9610-040c0e80b2c4") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299146 4795 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-1: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299167 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.299161765 +0000 UTC m=+225.464902663 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299191 4795 configmap.go:193] Couldn't get configMap openshift-apiserver/config: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299216 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-config podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.299210937 +0000 UTC m=+225.464951835 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-config") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299341 4795 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299449 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-serving-ca podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.299430663 +0000 UTC m=+225.465171561 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-serving-ca") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299645 4795 secret.go:188] Couldn't get secret openshift-oauth-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299688 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-client podName:e13b22c6-cfc8-4709-9610-040c0e80b2c4 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.29967732 +0000 UTC m=+225.465418208 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-client") pod "apiserver-7bbb656c7d-6slmv" (UID: "e13b22c6-cfc8-4709-9610-040c0e80b2c4") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299705 4795 secret.go:188] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299727 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-encryption-config podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.299720761 +0000 UTC m=+225.465461659 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-encryption-config") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299750 4795 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299782 4795 configmap.go:193] Couldn't get configMap openshift-oauth-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299799 4795 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299808 4795 secret.go:188] Couldn't get secret openshift-oauth-apiserver/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299842 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-trusted-ca-bundle podName:e13b22c6-cfc8-4709-9610-040c0e80b2c4 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.299833924 +0000 UTC m=+225.465574822 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-trusted-ca-bundle") pod "apiserver-7bbb656c7d-6slmv" (UID: "e13b22c6-cfc8-4709-9610-040c0e80b2c4") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299911 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-images podName:3f0f577c-af8a-414f-ad38-1d1d839d472f nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.299901176 +0000 UTC m=+225.465642174 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-images") pod "machine-api-operator-5694c8668f-7csjs" (UID: "3f0f577c-af8a-414f-ad38-1d1d839d472f") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299932 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f0f577c-af8a-414f-ad38-1d1d839d472f-machine-api-operator-tls podName:3f0f577c-af8a-414f-ad38-1d1d839d472f nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.299921977 +0000 UTC m=+225.465662995 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/3f0f577c-af8a-414f-ad38-1d1d839d472f-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-7csjs" (UID: "3f0f577c-af8a-414f-ad38-1d1d839d472f") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.299947 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-serving-cert podName:e13b22c6-cfc8-4709-9610-040c0e80b2c4 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:09:52.299939707 +0000 UTC m=+225.465680735 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-serving-cert") pod "apiserver-7bbb656c7d-6slmv" (UID: "e13b22c6-cfc8-4709-9610-040c0e80b2c4") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.300618 4795 secret.go:188] Couldn't get secret openshift-apiserver/etcd-client: failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.300685 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-client podName:03b6f1dc-43b6-489a-8990-7f4a9a33d535 nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.300662968 +0000 UTC m=+225.466403866 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-client") pod "apiserver-76f77b778f-cp7r2" (UID: "03b6f1dc-43b6-489a-8990-7f4a9a33d535") : failed to sync secret cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.303449 4795 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.303511 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-config podName:3f0f577c-af8a-414f-ad38-1d1d839d472f nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.303496668 +0000 UTC m=+225.469237566 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-config") pod "machine-api-operator-5694c8668f-7csjs" (UID: "3f0f577c-af8a-414f-ad38-1d1d839d472f") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.306375 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.310883 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qct7h\" (UniqueName: \"kubernetes.io/projected/f35c76bb-0875-44e1-9d13-9583c9dc29df-kube-api-access-qct7h\") pod \"cluster-image-registry-operator-dc59b4c8b-v4bxk\" (UID: \"f35c76bb-0875-44e1-9d13-9583c9dc29df\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.314413 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.326408 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.334464 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7bvd\" (UniqueName: \"kubernetes.io/projected/ac01e547-3f74-475d-b3f7-6558207aa984-kube-api-access-v7bvd\") pod \"service-ca-9c57cc56f-xmd69\" (UID: \"ac01e547-3f74-475d-b3f7-6558207aa984\") " pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.344137 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.378388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv5cw\" (UniqueName: \"kubernetes.io/projected/f45a1651-aa0f-44ad-9f76-4bcb22348e90-kube-api-access-mv5cw\") pod \"olm-operator-6b444d44fb-dlcdg\" (UID: \"f45a1651-aa0f-44ad-9f76-4bcb22348e90\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.383187 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.397340 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.398806 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.402911 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs6d6\" (UniqueName: \"kubernetes.io/projected/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-kube-api-access-cs6d6\") pod \"marketplace-operator-79b997595-sqw59\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:09:51 crc kubenswrapper[4795]: W0310 15:09:51.409493 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46936998_cf8b_4370_b4e6_e5a8759c895e.slice/crio-407f101d43a7dd6314e851cdc244530c6a590fff03eb55f765c9517965cb13d2 WatchSource:0}: Error finding container 407f101d43a7dd6314e851cdc244530c6a590fff03eb55f765c9517965cb13d2: Status 404 returned error can't find the container 
with id 407f101d43a7dd6314e851cdc244530c6a590fff03eb55f765c9517965cb13d2 Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.418538 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.420809 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd6qf"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.437346 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.437473 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.459489 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.479526 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.501583 4795 projected.go:288] Couldn't get configMap openshift-oauth-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.501619 4795 projected.go:194] Error preparing data for projected volume kube-api-access-6fl79 for pod openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv: failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.501668 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e13b22c6-cfc8-4709-9610-040c0e80b2c4-kube-api-access-6fl79 podName:e13b22c6-cfc8-4709-9610-040c0e80b2c4 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:09:52.001652347 +0000 UTC m=+225.167393245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-6fl79" (UniqueName: "kubernetes.io/projected/e13b22c6-cfc8-4709-9610-040c0e80b2c4-kube-api-access-6fl79") pod "apiserver-7bbb656c7d-6slmv" (UID: "e13b22c6-cfc8-4709-9610-040c0e80b2c4") : failed to sync configmap cache: timed out waiting for the condition Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.501940 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.518119 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56lgv\" (UniqueName: \"kubernetes.io/projected/03b6f1dc-43b6-489a-8990-7f4a9a33d535-kube-api-access-56lgv\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.524614 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.526686 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.528406 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.528656 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.536654 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.536923 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.541885 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-99x29"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.551470 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.555752 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.565424 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.579293 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.595749 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.633369 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.635980 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cg27q"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.638547 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.660852 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.668263 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v6c5\" (UniqueName: \"kubernetes.io/projected/3f0f577c-af8a-414f-ad38-1d1d839d472f-kube-api-access-6v6c5\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.669128 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6w8jb"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.673034 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-65bbp"] Mar 
10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.676469 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 15:09:51 crc kubenswrapper[4795]: W0310 15:09:51.679856 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod457e29f1_69c5_4524_a7e0_78f4944ca94d.slice/crio-c670344d40f5af8dd48602d5a5fd3a25ec8428dc4d02bef405bf4df38f5db542 WatchSource:0}: Error finding container c670344d40f5af8dd48602d5a5fd3a25ec8428dc4d02bef405bf4df38f5db542: Status 404 returned error can't find the container with id c670344d40f5af8dd48602d5a5fd3a25ec8428dc4d02bef405bf4df38f5db542 Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.686924 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.705784 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6c64\" (UniqueName: \"kubernetes.io/projected/afe203dd-0a5b-44c3-afd1-fe0452f276bc-kube-api-access-b6c64\") pod \"control-plane-machine-set-operator-78cbb6b69f-sd26k\" (UID: \"afe203dd-0a5b-44c3-afd1-fe0452f276bc\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.710037 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.710505 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bqvzg"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.712415 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.715392 4795 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.734247 4795 request.go:700] Waited for 1.730800639s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/secrets?fieldSelector=metadata.name%3Detcd-client&limit=500&resourceVersion=0 Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.735702 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.741269 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.756426 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.776617 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.780293 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t"] Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.790268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" event={"ID":"8cec803c-a30b-48aa-b9e7-7b913afcfb4a","Type":"ContainerStarted","Data":"0f991a3d774fb83c496fc42cde60798c3d4370d600281d8f5c859f789f3c71f7"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.790310 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" 
event={"ID":"8cec803c-a30b-48aa-b9e7-7b913afcfb4a","Type":"ContainerStarted","Data":"a6aadb82ff9cc541bc97c673a83b50ad2968b57943abc96b070cfb749ea14662"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.790320 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" event={"ID":"8cec803c-a30b-48aa-b9e7-7b913afcfb4a","Type":"ContainerStarted","Data":"a2f8d9aefc7df9c192c5436d9df385bb42f5e9e4c33f17e56229abe230a58a7a"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.792223 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" event={"ID":"46936998-cf8b-4370-b4e6-e5a8759c895e","Type":"ContainerStarted","Data":"091496a5b3ed4c4402dd44f8ab572e837954e72bc8ec653ddc7f55e8cfd55ef4"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.792246 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" event={"ID":"46936998-cf8b-4370-b4e6-e5a8759c895e","Type":"ContainerStarted","Data":"407f101d43a7dd6314e851cdc244530c6a590fff03eb55f765c9517965cb13d2"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.795214 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bqvzg" event={"ID":"8c484e22-84bb-402d-89ec-5251b11ae7e3","Type":"ContainerStarted","Data":"8adb8cb3ad3d02c52848f4fec14502b4667fb13c31cb77759391dd679bb63e4b"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.796559 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.800588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq" 
event={"ID":"457e29f1-69c5-4524-a7e0-78f4944ca94d","Type":"ContainerStarted","Data":"c670344d40f5af8dd48602d5a5fd3a25ec8428dc4d02bef405bf4df38f5db542"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.802658 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb" event={"ID":"2729207c-e9fe-4e76-9c70-81c9780f8d8c","Type":"ContainerStarted","Data":"0a6659625fe78cf283aba755ef333e238337422da7dd21769c61ad231c8a3352"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.804367 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" event={"ID":"05334446-f6c6-4982-9232-43c588eab91f","Type":"ContainerStarted","Data":"f01a332d95bddc132331fab4c620088ebf1703476b923fe8a1d35f9cef5f5e3b"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.804391 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" event={"ID":"05334446-f6c6-4982-9232-43c588eab91f","Type":"ContainerStarted","Data":"9157eb6bcef364edd1c4c36fffb82958b454a6f75e87daf65f0cb919c41019aa"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.805914 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" event={"ID":"c039cb4c-5391-4e99-828a-884abc3c3cf2","Type":"ContainerStarted","Data":"ea04736c81a2259dc4b28e5223cdd9f586ebf3b5b6e0b536d727053e27d2be19"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.807325 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" event={"ID":"676a93ca-0674-4884-991b-89518b118411","Type":"ContainerStarted","Data":"47ad4ad3e4bcfa51c34f837f8ff117d7d66181db71d88336ad57ba89f0c6d154"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.808033 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78" event={"ID":"264d5645-f3d4-4ad1-b7ad-01ef534d4a20","Type":"ContainerStarted","Data":"57ffab90ec972774f83293dd7da69fc72075bc7841c4596c382ca648ac9413ce"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.808846 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" event={"ID":"54355272-2661-4143-acf1-aa5d1c772e5d","Type":"ContainerStarted","Data":"4f05f47da896afe42a10b6033ff3107e6d0b7b78736efcc6264c91707718796b"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.810209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-99x29" event={"ID":"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78","Type":"ContainerStarted","Data":"5a9cbc71d79acf98aefa355646902850cbc33da6f26bf75f7e0db51ab35d39a3"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.813090 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xg845" event={"ID":"01f41b58-e54b-4f47-bd01-a11f9078087c","Type":"ContainerStarted","Data":"1bf7ff2ad6d4552b0feb926f9076ff27390d92438fb7fc7f4f0e1caaba8593c3"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.813126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xg845" event={"ID":"01f41b58-e54b-4f47-bd01-a11f9078087c","Type":"ContainerStarted","Data":"9db2c6f875121d99f021732667355f11b2c9835f21754ae2c2fb795c7feabea2"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.817039 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" event={"ID":"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb","Type":"ContainerStarted","Data":"5a9adaadccc1875c14f83e09f63c16fed8449734521831b0edb92b3fa2ea4cd3"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.818830 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" event={"ID":"6ca0abdf-0ae2-46bd-9b82-de007e620a36","Type":"ContainerStarted","Data":"79a1e7795b8dfca803829acc702f02ceea548115ade1caf885c0767fb373f556"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.818853 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" event={"ID":"6ca0abdf-0ae2-46bd-9b82-de007e620a36","Type":"ContainerStarted","Data":"5654c5988c391223be239c6745614dba21ce3665335afe72dfc6cbac6a8694f9"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.821318 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" event={"ID":"db723c0e-ac66-4ba7-a4f2-8ba208979d12","Type":"ContainerStarted","Data":"a5c45dba754cb70ab8834e257648b88f73f04e81723346909ae4cb97d0ba50fa"} Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.825324 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.838136 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.926695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-bound-sa-token\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927027 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw8mb\" (UniqueName: 
\"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-kube-api-access-rw8mb\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b6c8d10-50e2-458e-a3fd-0b67c039c705-secret-volume\") pod \"collect-profiles-29552580-v6pjn\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927082 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927522 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-certificates\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927565 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqmrb\" (UniqueName: \"kubernetes.io/projected/6b6c8d10-50e2-458e-a3fd-0b67c039c705-kube-api-access-dqmrb\") pod \"collect-profiles-29552580-v6pjn\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927631 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbf0019e-fefb-4cec-a75f-d8984f8fb0b8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vlsbk\" (UID: \"cbf0019e-fefb-4cec-a75f-d8984f8fb0b8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b6c8d10-50e2-458e-a3fd-0b67c039c705-config-volume\") pod \"collect-profiles-29552580-v6pjn\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rr26\" (UniqueName: \"kubernetes.io/projected/c868aa84-d232-4d80-bff3-d9e0aa659769-kube-api-access-9rr26\") pod \"auto-csr-approver-29552588-zdmvt\" (UID: \"c868aa84-d232-4d80-bff3-d9e0aa659769\") " pod="openshift-infra/auto-csr-approver-29552588-zdmvt" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927773 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7qzl\" (UniqueName: \"kubernetes.io/projected/cbf0019e-fefb-4cec-a75f-d8984f8fb0b8-kube-api-access-m7qzl\") pod \"package-server-manager-789f6589d5-vlsbk\" (UID: \"cbf0019e-fefb-4cec-a75f-d8984f8fb0b8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927829 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-tls\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927855 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-trusted-ca\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:51 crc kubenswrapper[4795]: I0310 15:09:51.927881 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:51 crc kubenswrapper[4795]: E0310 15:09:51.929818 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 15:09:52.429802981 +0000 UTC m=+225.595543989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.029578 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.029804 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.529779956 +0000 UTC m=+225.695520854 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.029960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rr26\" (UniqueName: \"kubernetes.io/projected/c868aa84-d232-4d80-bff3-d9e0aa659769-kube-api-access-9rr26\") pod \"auto-csr-approver-29552588-zdmvt\" (UID: \"c868aa84-d232-4d80-bff3-d9e0aa659769\") " pod="openshift-infra/auto-csr-approver-29552588-zdmvt" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.030004 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-plugins-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.030060 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnwtd\" (UniqueName: \"kubernetes.io/projected/3240d503-5d7b-4369-be00-56d215a796c0-kube-api-access-mnwtd\") pod \"machine-config-server-hj256\" (UID: \"3240d503-5d7b-4369-be00-56d215a796c0\") " pod="openshift-machine-config-operator/machine-config-server-hj256" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.030125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7qzl\" (UniqueName: \"kubernetes.io/projected/cbf0019e-fefb-4cec-a75f-d8984f8fb0b8-kube-api-access-m7qzl\") pod 
\"package-server-manager-789f6589d5-vlsbk\" (UID: \"cbf0019e-fefb-4cec-a75f-d8984f8fb0b8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.030178 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3240d503-5d7b-4369-be00-56d215a796c0-node-bootstrap-token\") pod \"machine-config-server-hj256\" (UID: \"3240d503-5d7b-4369-be00-56d215a796c0\") " pod="openshift-machine-config-operator/machine-config-server-hj256" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.030261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-tls\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.030283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-trusted-ca\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.030305 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/813b1cb2-7fca-4632-b1e1-a36cb988f944-metrics-tls\") pod \"dns-default-8v6fj\" (UID: \"813b1cb2-7fca-4632-b1e1-a36cb988f944\") " pod="openshift-dns/dns-default-8v6fj" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.030324 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.030391 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slqd4\" (UniqueName: \"kubernetes.io/projected/bc9f05ef-47e0-4866-a629-1cf109bf3752-kube-api-access-slqd4\") pod \"ingress-canary-6q2qw\" (UID: \"bc9f05ef-47e0-4866-a629-1cf109bf3752\") " pod="openshift-ingress-canary/ingress-canary-6q2qw" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.030547 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-socket-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.032924 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-trusted-ca\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.033719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-bound-sa-token\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.033774 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-registration-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.033910 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc9f05ef-47e0-4866-a629-1cf109bf3752-cert\") pod \"ingress-canary-6q2qw\" (UID: \"bc9f05ef-47e0-4866-a629-1cf109bf3752\") " pod="openshift-ingress-canary/ingress-canary-6q2qw"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.034012 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-csi-data-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.035019 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw8mb\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-kube-api-access-rw8mb\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.035106 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/813b1cb2-7fca-4632-b1e1-a36cb988f944-config-volume\") pod \"dns-default-8v6fj\" (UID: \"813b1cb2-7fca-4632-b1e1-a36cb988f944\") " pod="openshift-dns/dns-default-8v6fj"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.035151 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b6c8d10-50e2-458e-a3fd-0b67c039c705-secret-volume\") pod \"collect-profiles-29552580-v6pjn\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.035319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.035390 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.035586 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.535571359 +0000 UTC m=+225.701312367 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.035855 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.035891 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-mountpoint-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.036425 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-certificates\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.036461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh7kz\" (UniqueName: \"kubernetes.io/projected/813b1cb2-7fca-4632-b1e1-a36cb988f944-kube-api-access-sh7kz\") pod \"dns-default-8v6fj\" (UID: \"813b1cb2-7fca-4632-b1e1-a36cb988f944\") " pod="openshift-dns/dns-default-8v6fj"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.036657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxm52\" (UniqueName: \"kubernetes.io/projected/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-kube-api-access-nxm52\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.036908 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqmrb\" (UniqueName: \"kubernetes.io/projected/6b6c8d10-50e2-458e-a3fd-0b67c039c705-kube-api-access-dqmrb\") pod \"collect-profiles-29552580-v6pjn\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.037196 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbf0019e-fefb-4cec-a75f-d8984f8fb0b8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vlsbk\" (UID: \"cbf0019e-fefb-4cec-a75f-d8984f8fb0b8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.037287 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b6c8d10-50e2-458e-a3fd-0b67c039c705-config-volume\") pod \"collect-profiles-29552580-v6pjn\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.037357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-certificates\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.037572 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fl79\" (UniqueName: \"kubernetes.io/projected/e13b22c6-cfc8-4709-9610-040c0e80b2c4-kube-api-access-6fl79\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.037620 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3240d503-5d7b-4369-be00-56d215a796c0-certs\") pod \"machine-config-server-hj256\" (UID: \"3240d503-5d7b-4369-be00-56d215a796c0\") " pod="openshift-machine-config-operator/machine-config-server-hj256"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.038307 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b6c8d10-50e2-458e-a3fd-0b67c039c705-config-volume\") pod \"collect-profiles-29552580-v6pjn\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.039519 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.041011 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-tls\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.041293 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fl79\" (UniqueName: \"kubernetes.io/projected/e13b22c6-cfc8-4709-9610-040c0e80b2c4-kube-api-access-6fl79\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.041638 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/cbf0019e-fefb-4cec-a75f-d8984f8fb0b8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-vlsbk\" (UID: \"cbf0019e-fefb-4cec-a75f-d8984f8fb0b8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.042145 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b6c8d10-50e2-458e-a3fd-0b67c039c705-secret-volume\") pod \"collect-profiles-29552580-v6pjn\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.071833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7qzl\" (UniqueName: \"kubernetes.io/projected/cbf0019e-fefb-4cec-a75f-d8984f8fb0b8-kube-api-access-m7qzl\") pod \"package-server-manager-789f6589d5-vlsbk\" (UID: \"cbf0019e-fefb-4cec-a75f-d8984f8fb0b8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.093842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rr26\" (UniqueName: \"kubernetes.io/projected/c868aa84-d232-4d80-bff3-d9e0aa659769-kube-api-access-9rr26\") pod \"auto-csr-approver-29552588-zdmvt\" (UID: \"c868aa84-d232-4d80-bff3-d9e0aa659769\") " pod="openshift-infra/auto-csr-approver-29552588-zdmvt"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.114096 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-bound-sa-token\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.136184 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw8mb\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-kube-api-access-rw8mb\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138172 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-mountpoint-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh7kz\" (UniqueName: \"kubernetes.io/projected/813b1cb2-7fca-4632-b1e1-a36cb988f944-kube-api-access-sh7kz\") pod \"dns-default-8v6fj\" (UID: \"813b1cb2-7fca-4632-b1e1-a36cb988f944\") " pod="openshift-dns/dns-default-8v6fj"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138492 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxm52\" (UniqueName: \"kubernetes.io/projected/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-kube-api-access-nxm52\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3240d503-5d7b-4369-be00-56d215a796c0-certs\") pod \"machine-config-server-hj256\" (UID: \"3240d503-5d7b-4369-be00-56d215a796c0\") " pod="openshift-machine-config-operator/machine-config-server-hj256"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-plugins-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138629 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnwtd\" (UniqueName: \"kubernetes.io/projected/3240d503-5d7b-4369-be00-56d215a796c0-kube-api-access-mnwtd\") pod \"machine-config-server-hj256\" (UID: \"3240d503-5d7b-4369-be00-56d215a796c0\") " pod="openshift-machine-config-operator/machine-config-server-hj256"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138664 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3240d503-5d7b-4369-be00-56d215a796c0-node-bootstrap-token\") pod \"machine-config-server-hj256\" (UID: \"3240d503-5d7b-4369-be00-56d215a796c0\") " pod="openshift-machine-config-operator/machine-config-server-hj256"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138693 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/813b1cb2-7fca-4632-b1e1-a36cb988f944-metrics-tls\") pod \"dns-default-8v6fj\" (UID: \"813b1cb2-7fca-4632-b1e1-a36cb988f944\") " pod="openshift-dns/dns-default-8v6fj"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slqd4\" (UniqueName: \"kubernetes.io/projected/bc9f05ef-47e0-4866-a629-1cf109bf3752-kube-api-access-slqd4\") pod \"ingress-canary-6q2qw\" (UID: \"bc9f05ef-47e0-4866-a629-1cf109bf3752\") " pod="openshift-ingress-canary/ingress-canary-6q2qw"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138773 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-socket-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138832 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-registration-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138862 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc9f05ef-47e0-4866-a629-1cf109bf3752-cert\") pod \"ingress-canary-6q2qw\" (UID: \"bc9f05ef-47e0-4866-a629-1cf109bf3752\") " pod="openshift-ingress-canary/ingress-canary-6q2qw"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138882 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-csi-data-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.138934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/813b1cb2-7fca-4632-b1e1-a36cb988f944-config-volume\") pod \"dns-default-8v6fj\" (UID: \"813b1cb2-7fca-4632-b1e1-a36cb988f944\") " pod="openshift-dns/dns-default-8v6fj"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.146479 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/813b1cb2-7fca-4632-b1e1-a36cb988f944-config-volume\") pod \"dns-default-8v6fj\" (UID: \"813b1cb2-7fca-4632-b1e1-a36cb988f944\") " pod="openshift-dns/dns-default-8v6fj"
Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.146605 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.646583944 +0000 UTC m=+225.812324842 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.146664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-mountpoint-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.148317 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-plugins-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.148460 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-registration-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.148571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-socket-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.149761 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-csi-data-dir\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.167904 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-gnrqn"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.169310 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-g452c"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.191938 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/3240d503-5d7b-4369-be00-56d215a796c0-certs\") pod \"machine-config-server-hj256\" (UID: \"3240d503-5d7b-4369-be00-56d215a796c0\") " pod="openshift-machine-config-operator/machine-config-server-hj256"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.192106 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqmrb\" (UniqueName: \"kubernetes.io/projected/6b6c8d10-50e2-458e-a3fd-0b67c039c705-kube-api-access-dqmrb\") pod \"collect-profiles-29552580-v6pjn\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.193901 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/3240d503-5d7b-4369-be00-56d215a796c0-node-bootstrap-token\") pod \"machine-config-server-hj256\" (UID: \"3240d503-5d7b-4369-be00-56d215a796c0\") " pod="openshift-machine-config-operator/machine-config-server-hj256"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.196370 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-xmd69"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.200587 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bc9f05ef-47e0-4866-a629-1cf109bf3752-cert\") pod \"ingress-canary-6q2qw\" (UID: \"bc9f05ef-47e0-4866-a629-1cf109bf3752\") " pod="openshift-ingress-canary/ingress-canary-6q2qw"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.200673 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.216356 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.216769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/813b1cb2-7fca-4632-b1e1-a36cb988f944-metrics-tls\") pod \"dns-default-8v6fj\" (UID: \"813b1cb2-7fca-4632-b1e1-a36cb988f944\") " pod="openshift-dns/dns-default-8v6fj"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.222456 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh7kz\" (UniqueName: \"kubernetes.io/projected/813b1cb2-7fca-4632-b1e1-a36cb988f944-kube-api-access-sh7kz\") pod \"dns-default-8v6fj\" (UID: \"813b1cb2-7fca-4632-b1e1-a36cb988f944\") " pod="openshift-dns/dns-default-8v6fj"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.222938 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxm52\" (UniqueName: \"kubernetes.io/projected/e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976-kube-api-access-nxm52\") pod \"csi-hostpathplugin-h8mc4\" (UID: \"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976\") " pod="hostpath-provisioner/csi-hostpathplugin-h8mc4"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.242730 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.243335 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.743323038 +0000 UTC m=+225.909063926 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.243648 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.251022 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.251907 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.253854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnwtd\" (UniqueName: \"kubernetes.io/projected/3240d503-5d7b-4369-be00-56d215a796c0-kube-api-access-mnwtd\") pod \"machine-config-server-hj256\" (UID: \"3240d503-5d7b-4369-be00-56d215a796c0\") " pod="openshift-machine-config-operator/machine-config-server-hj256"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.254213 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.255106 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slqd4\" (UniqueName: \"kubernetes.io/projected/bc9f05ef-47e0-4866-a629-1cf109bf3752-kube-api-access-slqd4\") pod \"ingress-canary-6q2qw\" (UID: \"bc9f05ef-47e0-4866-a629-1cf109bf3752\") " pod="openshift-ingress-canary/ingress-canary-6q2qw"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.257308 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.258027 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.260137 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.263257 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.264827 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.266058 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.267322 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k"]
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.274187 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-zdmvt"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.309508 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.326882 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xg845"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.330178 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.330224 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344279 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344627 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-serving-ca\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344660 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-images\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344708 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-client\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344760 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-serving-cert\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-encryption-config\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f0f577c-af8a-414f-ad38-1d1d839d472f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344877 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-client\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344899 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-config\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344931 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-serving-cert\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.344956 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.345016 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-encryption-config\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.345035 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.345078 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-config\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.345685 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-config\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2"
Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.345758 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.845744322 +0000 UTC m=+226.011485220 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.347505 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-serving-ca\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.348128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-config\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs"
Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.348700 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\"
(UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-trusted-ca-bundle\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.348727 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e13b22c6-cfc8-4709-9610-040c0e80b2c4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.348961 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/3f0f577c-af8a-414f-ad38-1d1d839d472f-images\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.349303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/03b6f1dc-43b6-489a-8990-7f4a9a33d535-audit\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.349393 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6q2qw" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.352817 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-encryption-config\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.353075 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-serving-cert\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.353093 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-serving-cert\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.353756 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f0f577c-af8a-414f-ad38-1d1d839d472f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7csjs\" (UID: \"3f0f577c-af8a-414f-ad38-1d1d839d472f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.355245 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-encryption-config\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.355707 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/03b6f1dc-43b6-489a-8990-7f4a9a33d535-etcd-client\") pod \"apiserver-76f77b778f-cp7r2\" (UID: \"03b6f1dc-43b6-489a-8990-7f4a9a33d535\") " pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.356471 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8v6fj" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.357257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e13b22c6-cfc8-4709-9610-040c0e80b2c4-etcd-client\") pod \"apiserver-7bbb656c7d-6slmv\" (UID: \"e13b22c6-cfc8-4709-9610-040c0e80b2c4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.364444 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-hj256" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.370246 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-h8mc4" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.392386 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqw59"] Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.445979 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.446548 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:52.94653291 +0000 UTC m=+226.112273808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: W0310 15:09:52.460547 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb06a76ed_56f2_47d3_a6f0_3f1f889e77a9.slice/crio-9343ea30c764d2147fedb9d0ef2907d3e46be8d61997c70e23524d6b862821a3 WatchSource:0}: Error finding container 9343ea30c764d2147fedb9d0ef2907d3e46be8d61997c70e23524d6b862821a3: Status 404 returned error can't find the container with id 9343ea30c764d2147fedb9d0ef2907d3e46be8d61997c70e23524d6b862821a3 Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.514856 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.547570 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.549163 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.049132548 +0000 UTC m=+226.214873446 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.553674 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.554027 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.054013036 +0000 UTC m=+226.219753934 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.559726 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.574323 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.645748 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xg845" podStartSLOduration=157.645731438 podStartE2EDuration="2m37.645731438s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:52.644990107 +0000 UTC m=+225.810731005" watchObservedRunningTime="2026-03-10 15:09:52.645731438 +0000 UTC m=+225.811472336" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.657795 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.657993 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.157970912 +0000 UTC m=+226.323711820 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.658105 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.658350 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.158343193 +0000 UTC m=+226.324084091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.726972 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-w7lpw" podStartSLOduration=157.726957725 podStartE2EDuration="2m37.726957725s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:52.724919097 +0000 UTC m=+225.890659995" watchObservedRunningTime="2026-03-10 15:09:52.726957725 +0000 UTC m=+225.892698623" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.727492 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"] Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.747239 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-zdmvt"] Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.757811 4795 ???:1] "http: TLS handshake error from 192.168.126.11:54802: no serving certificate available for the kubelet" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.764028 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.765677 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.265650104 +0000 UTC m=+226.431391002 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.766800 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.767445 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.267435384 +0000 UTC m=+226.433176282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.835089 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g452c" event={"ID":"a8e49156-de09-480a-933b-6815cde0b311","Type":"ContainerStarted","Data":"610b29f6ba39af590af62b9626e010b61dd5c359b3890c33218ea0eb7f5527bb"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.850848 4795 ???:1] "http: TLS handshake error from 192.168.126.11:54812: no serving certificate available for the kubelet" Mar 10 15:09:52 crc kubenswrapper[4795]: W0310 15:09:52.851328 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b6c8d10_50e2_458e_a3fd_0b67c039c705.slice/crio-81693494ddd2bce2134a622f49a06389cc353703632c8b1d590e4e47174812c0 WatchSource:0}: Error finding container 81693494ddd2bce2134a622f49a06389cc353703632c8b1d590e4e47174812c0: Status 404 returned error can't find the container with id 81693494ddd2bce2134a622f49a06389cc353703632c8b1d590e4e47174812c0 Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.852620 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" event={"ID":"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb","Type":"ContainerStarted","Data":"496692d33a551c0861b47dca7a221cd22241563427cc3b4d7b89f88fe3a358d4"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.853123 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:52 crc kubenswrapper[4795]: W0310 15:09:52.856877 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc868aa84_d232_4d80_bff3_d9e0aa659769.slice/crio-e7a83bbab98b9229068c3d8abc29f8bf4dbfb8f0f084077b749155c70ef8013e WatchSource:0}: Error finding container e7a83bbab98b9229068c3d8abc29f8bf4dbfb8f0f084077b749155c70ef8013e: Status 404 returned error can't find the container with id e7a83bbab98b9229068c3d8abc29f8bf4dbfb8f0f084077b749155c70ef8013e Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.858830 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" event={"ID":"54355272-2661-4143-acf1-aa5d1c772e5d","Type":"ContainerStarted","Data":"10eee4444e605f26ed1243d75bca95235e09d34e5feb77133160b9784ed95cd6"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.871420 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.871629 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.871642 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.371613047 +0000 UTC m=+226.537353945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.872863 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.871722 4795 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-jd6qf container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.873113 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" podUID="194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.873545 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.373535442 +0000 UTC m=+226.539276340 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.874361 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" event={"ID":"2d64a005-4a73-4a13-886f-0b072b495bcf","Type":"ContainerStarted","Data":"936bab1686255f8b45b2820a0ae33af2cc1eef5c9de5ab6fd119304e4ff684da"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.887337 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" event={"ID":"f45a1651-aa0f-44ad-9f76-4bcb22348e90","Type":"ContainerStarted","Data":"c7b43902b27ed5a200beb2e62c46c24f5de66824ae1fbf4895b0f9ef57572bcc"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.915501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" event={"ID":"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9","Type":"ContainerStarted","Data":"9343ea30c764d2147fedb9d0ef2907d3e46be8d61997c70e23524d6b862821a3"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.922216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" event={"ID":"676a93ca-0674-4884-991b-89518b118411","Type":"ContainerStarted","Data":"bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.923049 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.927516 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-99x29" event={"ID":"874d72ba-5718-48d5-b2a8-dd2cb5c0cc78","Type":"ContainerStarted","Data":"6608ea33ea9b48c224cc81746b50a92b8d5b7a848f101990e0059bb9d02e89e2"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.928921 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.932451 4795 patch_prober.go:28] interesting pod/console-operator-58897d9998-99x29 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.932492 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-99x29" podUID="874d72ba-5718-48d5-b2a8-dd2cb5c0cc78" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.932518 4795 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-cg27q container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.932561 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" podUID="676a93ca-0674-4884-991b-89518b118411" containerName="controller-manager" 
probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.933603 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" event={"ID":"f35c76bb-0875-44e1-9d13-9583c9dc29df","Type":"ContainerStarted","Data":"0e02aa397025ff32eff5b8f3c6659bb19134b61ade0aa33cb7b5032d204a0132"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.944998 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" event={"ID":"db723c0e-ac66-4ba7-a4f2-8ba208979d12","Type":"ContainerStarted","Data":"eb631209430ca3ae99eb194f964a10b23292f5939e42c009521faf4bc14f01a9"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.948696 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" event={"ID":"4705e450-7ac4-4741-bab8-e17cb6a79050","Type":"ContainerStarted","Data":"fc6fdd390c61b16ea022ce339c56c802e61d6625c612425a875dbbe323b0457b"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.949689 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" event={"ID":"4dfa6c2b-23af-4e87-9117-932c416ed4d2","Type":"ContainerStarted","Data":"5445fb7730b449a435ff2d81970877bc0bcc1e158b7efa6f737370a31500ab6d"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.950384 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" event={"ID":"59e98840-fc56-457c-868a-2716e92a2d54","Type":"ContainerStarted","Data":"af170daea3a2eafa5c250a6e02a93173218ae315ba2541e6af6301818cb2f19f"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.952082 4795 ???:1] "http: TLS handshake error from 192.168.126.11:54822: no serving certificate available 
for the kubelet" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.954432 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78" event={"ID":"264d5645-f3d4-4ad1-b7ad-01ef534d4a20","Type":"ContainerStarted","Data":"9ad148999e0eb99ded887e19c228182dd1eb7922ebb7e3faae6ae85d154849a7"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.970852 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2slt9" podStartSLOduration=157.970835331 podStartE2EDuration="2m37.970835331s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:52.970598215 +0000 UTC m=+226.136339123" watchObservedRunningTime="2026-03-10 15:09:52.970835331 +0000 UTC m=+226.136576229" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.975922 4795 generic.go:334] "Generic (PLEG): container finished" podID="46936998-cf8b-4370-b4e6-e5a8759c895e" containerID="091496a5b3ed4c4402dd44f8ab572e837954e72bc8ec653ddc7f55e8cfd55ef4" exitCode=0 Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.976024 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" event={"ID":"46936998-cf8b-4370-b4e6-e5a8759c895e","Type":"ContainerDied","Data":"091496a5b3ed4c4402dd44f8ab572e837954e72bc8ec653ddc7f55e8cfd55ef4"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.985512 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" event={"ID":"05989857-37b3-4ca7-960b-9f610bd6cd2c","Type":"ContainerStarted","Data":"f9e330d04aa3fb8914f4f038b3dbf49af71493fe4ae556b53c7fa281ab03d4b6"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.994385 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bqvzg" event={"ID":"8c484e22-84bb-402d-89ec-5251b11ae7e3","Type":"ContainerStarted","Data":"cdf03789c1385fab9f684581bcd74bcf400e4622ce65c167a12317b67eb32592"} Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.994988 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:52 crc kubenswrapper[4795]: E0310 15:09:52.995571 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.495554127 +0000 UTC m=+226.661295025 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.996470 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-bqvzg" Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.998803 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-bqvzg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 10 15:09:52 crc kubenswrapper[4795]: I0310 15:09:52.998862 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bqvzg" podUID="8c484e22-84bb-402d-89ec-5251b11ae7e3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.024599 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" event={"ID":"ad6f1773-b828-4de8-8d5f-0927c9853100","Type":"ContainerStarted","Data":"0a0c5ff3d13cd08cfb6f385381f9eff73c8dd13d5c9ab4993b412c959e723687"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.029556 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" 
event={"ID":"2207f46f-83de-4190-a784-533331d951e7","Type":"ContainerStarted","Data":"532ccbd4e7894ac2771927363e9e203610ca2e9252c28a49ae9487972c4bb2ae"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.033810 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" event={"ID":"ac01e547-3f74-475d-b3f7-6558207aa984","Type":"ContainerStarted","Data":"c21f755503d22842dbc9c63cbc0696fc0bf34289f996f34d7070aa9deb328c29"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.058088 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" event={"ID":"afe203dd-0a5b-44c3-afd1-fe0452f276bc","Type":"ContainerStarted","Data":"241b0a95746a10b9add503dcf652f88c65fe4a13f26bcffe41ac8f3980c9d06e"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.077144 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq" event={"ID":"457e29f1-69c5-4524-a7e0-78f4944ca94d","Type":"ContainerStarted","Data":"052b5bb41f448750d4c8f048c976d703b4aa1cc14ab0b31afcf2165f1b4bb829"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.084299 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" event={"ID":"c039cb4c-5391-4e99-828a-884abc3c3cf2","Type":"ContainerStarted","Data":"f1bde69136c36aadc727ed8868db2262fbfbcf2ec5b79590ebe2516f469ed862"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.084331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" event={"ID":"c039cb4c-5391-4e99-828a-884abc3c3cf2","Type":"ContainerStarted","Data":"abd285d7bccf1ba7207954a9d14de68d77e64317200ac4302c80abcc7d201ae7"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.109276 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:53 crc kubenswrapper[4795]: E0310 15:09:53.110557 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.610541645 +0000 UTC m=+226.776282543 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.116866 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb" event={"ID":"2729207c-e9fe-4e76-9c70-81c9780f8d8c","Type":"ContainerStarted","Data":"f73e58d290f4d804c3820b87dd7d3d3e0e4655b4a5fc6141f3da3b9fa9a4862f"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.136104 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" event={"ID":"721d4d9c-e120-4721-b37f-6f2f5998153b","Type":"ContainerStarted","Data":"5498725b483f53390b4ef4b5e2dd12f72379b8a74b3722241de2fe92126629e4"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.149755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.142631 4795 ???:1] "http: TLS handshake error from 192.168.126.11:54828: no serving certificate available for the kubelet" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.149787 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk"] Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.149805 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" event={"ID":"721d4d9c-e120-4721-b37f-6f2f5998153b","Type":"ContainerStarted","Data":"d5600d093d5e33b68279bc901e02424fffd5379001d345e94bada71f8a96bdb7"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.149821 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" event={"ID":"f5954f37-be0d-4076-ae21-ba0039aeb052","Type":"ContainerStarted","Data":"3f68c2adb35598598d64721c81a5562ca95e28898d52542ecb6bf95a06f283d2"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.149831 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" event={"ID":"fe75b643-5c6f-461c-b1aa-4d73f89dad97","Type":"ContainerStarted","Data":"2345784711e61f3f41db438ca942e1327ad25f92424f05cd139870fae6722f5b"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.146264 4795 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xfqrl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.149842 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" event={"ID":"597bba13-bc92-4963-a8df-6d32d47d3864","Type":"ContainerStarted","Data":"82d4d329a6b91a4e758cc45ac34d21cd9ede5162eb69728bcb49aa5dc31cfafc"} Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.149862 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" podUID="f5954f37-be0d-4076-ae21-ba0039aeb052" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.212758 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:53 crc kubenswrapper[4795]: E0310 15:09:53.213105 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.713089822 +0000 UTC m=+226.878830720 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.232624 4795 ???:1] "http: TLS handshake error from 192.168.126.11:54834: no serving certificate available for the kubelet" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.315531 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:53 crc kubenswrapper[4795]: E0310 15:09:53.318903 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.818888091 +0000 UTC m=+226.984628989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.325808 4795 ???:1] "http: TLS handshake error from 192.168.126.11:54840: no serving certificate available for the kubelet" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.346156 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:53 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:09:53 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:09:53 crc kubenswrapper[4795]: healthz check failed Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.346209 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.366257 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9s254" podStartSLOduration=158.366242974 podStartE2EDuration="2m38.366242974s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:53.36540803 +0000 UTC m=+226.531148928" 
watchObservedRunningTime="2026-03-10 15:09:53.366242974 +0000 UTC m=+226.531983872" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.416616 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.428757 4795 ???:1] "http: TLS handshake error from 192.168.126.11:54852: no serving certificate available for the kubelet" Mar 10 15:09:53 crc kubenswrapper[4795]: E0310 15:09:53.429290 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.929253788 +0000 UTC m=+227.094994686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.429478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:53 crc kubenswrapper[4795]: E0310 15:09:53.429795 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:53.929787353 +0000 UTC m=+227.095528251 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.536170 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:53 crc kubenswrapper[4795]: E0310 15:09:53.536780 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:54.036763055 +0000 UTC m=+227.202503953 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.546018 4795 ???:1] "http: TLS handshake error from 192.168.126.11:54858: no serving certificate available for the kubelet" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.639773 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:53 crc kubenswrapper[4795]: E0310 15:09:53.641956 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:54.141942836 +0000 UTC m=+227.307683734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.707733 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8v6fj"] Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.740813 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.740916 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-65bbp" podStartSLOduration=158.740907023 podStartE2EDuration="2m38.740907023s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:53.740290545 +0000 UTC m=+226.906031443" watchObservedRunningTime="2026-03-10 15:09:53.740907023 +0000 UTC m=+226.906647921" Mar 10 15:09:53 crc kubenswrapper[4795]: E0310 15:09:53.741293 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:54.241278333 +0000 UTC m=+227.407019231 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.769366 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-cp7r2"] Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.833299 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" podStartSLOduration=158.833279453 podStartE2EDuration="2m38.833279453s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:53.818312462 +0000 UTC m=+226.984053360" watchObservedRunningTime="2026-03-10 15:09:53.833279453 +0000 UTC m=+226.999020351" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.835274 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6q2qw"] Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.842596 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"] Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.847663 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" podStartSLOduration=158.847649318 podStartE2EDuration="2m38.847649318s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-10 15:09:53.84737555 +0000 UTC m=+227.013116448" watchObservedRunningTime="2026-03-10 15:09:53.847649318 +0000 UTC m=+227.013390216" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.851958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:53 crc kubenswrapper[4795]: E0310 15:09:53.852250 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:54.352237547 +0000 UTC m=+227.517978445 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.853235 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h8mc4"] Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.868591 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-99x29" podStartSLOduration=158.868574037 podStartE2EDuration="2m38.868574037s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:53.866442687 +0000 UTC m=+227.032183585" watchObservedRunningTime="2026-03-10 15:09:53.868574037 +0000 UTC m=+227.034314935" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.905129 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c5hxz" podStartSLOduration=158.905113666 podStartE2EDuration="2m38.905113666s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:53.903273444 +0000 UTC m=+227.069014342" watchObservedRunningTime="2026-03-10 15:09:53.905113666 +0000 UTC m=+227.070854564" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.935350 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-bqvzg" podStartSLOduration=158.935328776 podStartE2EDuration="2m38.935328776s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:53.934674008 +0000 UTC m=+227.100414906" watchObservedRunningTime="2026-03-10 15:09:53.935328776 +0000 UTC m=+227.101069664" Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.953567 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:53 crc kubenswrapper[4795]: E0310 15:09:53.953971 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:54.45395435 +0000 UTC m=+227.619695238 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:53 crc kubenswrapper[4795]: I0310 15:09:53.988334 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7csjs"] Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.048447 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" podStartSLOduration=159.048419249 podStartE2EDuration="2m39.048419249s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.02039596 +0000 UTC m=+227.186136858" watchObservedRunningTime="2026-03-10 15:09:54.048419249 +0000 UTC m=+227.214160147" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.049862 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" podStartSLOduration=159.0498566 podStartE2EDuration="2m39.0498566s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
15:09:54.048258385 +0000 UTC m=+227.213999283" watchObservedRunningTime="2026-03-10 15:09:54.0498566 +0000 UTC m=+227.215597488" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.054592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:54 crc kubenswrapper[4795]: E0310 15:09:54.054940 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:54.554929333 +0000 UTC m=+227.720670231 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.104663 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-vrbm6" podStartSLOduration=159.104643783 podStartE2EDuration="2m39.104643783s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.104633272 +0000 UTC m=+227.270374190" watchObservedRunningTime="2026-03-10 
15:09:54.104643783 +0000 UTC m=+227.270384681" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.127205 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r4d5t" podStartSLOduration=159.127186617 podStartE2EDuration="2m39.127186617s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.126690613 +0000 UTC m=+227.292431511" watchObservedRunningTime="2026-03-10 15:09:54.127186617 +0000 UTC m=+227.292927505" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.171740 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:54 crc kubenswrapper[4795]: E0310 15:09:54.172050 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:54.67203464 +0000 UTC m=+227.837775528 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.207723 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8v6fj" event={"ID":"813b1cb2-7fca-4632-b1e1-a36cb988f944","Type":"ContainerStarted","Data":"4bf2445135841d2fa62a0b530318f7ae4a7061a663e261062279920fc8db9e4d"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.211536 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" event={"ID":"f5954f37-be0d-4076-ae21-ba0039aeb052","Type":"ContainerStarted","Data":"447f5883860b7778db3b4b2a6e88199651bcb0822e65e1946347b1fd3c0c9d79"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.212590 4795 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-xfqrl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.212617 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" podUID="f5954f37-be0d-4076-ae21-ba0039aeb052" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.217521 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-f9d7485db-g452c" event={"ID":"a8e49156-de09-480a-933b-6815cde0b311","Type":"ContainerStarted","Data":"388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.237375 4795 ???:1] "http: TLS handshake error from 192.168.126.11:54860: no serving certificate available for the kubelet" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.238056 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-g452c" podStartSLOduration=159.238042688 podStartE2EDuration="2m39.238042688s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.236211417 +0000 UTC m=+227.401952315" watchObservedRunningTime="2026-03-10 15:09:54.238042688 +0000 UTC m=+227.403783586" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.241162 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk" event={"ID":"cbf0019e-fefb-4cec-a75f-d8984f8fb0b8","Type":"ContainerStarted","Data":"3a487f791add96dbd289358bc8dd4142457b1c9fc44655a657c898661895fd65"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.241203 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk" event={"ID":"cbf0019e-fefb-4cec-a75f-d8984f8fb0b8","Type":"ContainerStarted","Data":"b54029f05d66aa7008e703754a088a2663a80b7874d025cc5cfec05ebb1cf986"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.248426 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" event={"ID":"03b6f1dc-43b6-489a-8990-7f4a9a33d535","Type":"ContainerStarted","Data":"9dab120fb894e82ec5e4074aae8e1d41624fb24107ffebdfb46efac2b3be7ff4"} Mar 10 15:09:54 
crc kubenswrapper[4795]: I0310 15:09:54.250045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552588-zdmvt" event={"ID":"c868aa84-d232-4d80-bff3-d9e0aa659769","Type":"ContainerStarted","Data":"e7a83bbab98b9229068c3d8abc29f8bf4dbfb8f0f084077b749155c70ef8013e"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.264681 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" event={"ID":"ac01e547-3f74-475d-b3f7-6558207aa984","Type":"ContainerStarted","Data":"806684d5fc0411743974952127f4a044ada09d05b7d7c59e50040e7ef15317c5"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.271604 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" event={"ID":"4dfa6c2b-23af-4e87-9117-932c416ed4d2","Type":"ContainerStarted","Data":"ce92b51076ecd00557c13ae9f190534f0b9246df6c652da4436bc5acd2f6606e"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.272401 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.272848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:54 crc kubenswrapper[4795]: E0310 15:09:54.273983 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:54.77396725 +0000 UTC m=+227.939708148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.274275 4795 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-8kghz container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:5443/healthz\": dial tcp 10.217.0.18:5443: connect: connection refused" start-of-body= Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.274301 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" podUID="4dfa6c2b-23af-4e87-9117-932c416ed4d2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.18:5443/healthz\": dial tcp 10.217.0.18:5443: connect: connection refused" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.286185 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" event={"ID":"59e98840-fc56-457c-868a-2716e92a2d54","Type":"ContainerStarted","Data":"932060cdfde46d5cb8852313513fc9fea5d1862443a27625b8043e5e1726946b"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.286239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" event={"ID":"59e98840-fc56-457c-868a-2716e92a2d54","Type":"ContainerStarted","Data":"447a6863893b45001e0b297d1caab4728e514d845045a560d8c7f063ff735607"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.290177 4795 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-xmd69" podStartSLOduration=159.290160706 podStartE2EDuration="2m39.290160706s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.289853507 +0000 UTC m=+227.455594405" watchObservedRunningTime="2026-03-10 15:09:54.290160706 +0000 UTC m=+227.455901604" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.312283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" event={"ID":"fe75b643-5c6f-461c-b1aa-4d73f89dad97","Type":"ContainerStarted","Data":"fce0179061c6872e0321440421fcc50c31e4b7ab72cd7e300757219ea062a445"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.328404 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" event={"ID":"4705e450-7ac4-4741-bab8-e17cb6a79050","Type":"ContainerStarted","Data":"550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.329516 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.331490 4795 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-45xss container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.331535 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" 
podUID="4705e450-7ac4-4741-bab8-e17cb6a79050" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.340262 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:54 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:09:54 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:09:54 crc kubenswrapper[4795]: healthz check failed Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.340314 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.369518 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78" event={"ID":"264d5645-f3d4-4ad1-b7ad-01ef534d4a20","Type":"ContainerStarted","Data":"82658ae6e4c7befb6f8a87f77d8e1cba287eb4f2cfcc8905efbfdfbf80b834b9"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.376970 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:54 crc kubenswrapper[4795]: E0310 15:09:54.378095 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:54.87805072 +0000 UTC m=+228.043791608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.382508 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb" event={"ID":"2729207c-e9fe-4e76-9c70-81c9780f8d8c","Type":"ContainerStarted","Data":"e7a4399812a668d5c87a4889144e7c4886aeebb1e96e150b766e009827d4f4c3"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.394478 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" podStartSLOduration=159.394456722 podStartE2EDuration="2m39.394456722s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.35744009 +0000 UTC m=+227.523180988" watchObservedRunningTime="2026-03-10 15:09:54.394456722 +0000 UTC m=+227.560197620" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.395936 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-x97l4" podStartSLOduration=159.395925414 podStartE2EDuration="2m39.395925414s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.393713741 +0000 UTC m=+227.559454639" watchObservedRunningTime="2026-03-10 15:09:54.395925414 +0000 UTC m=+227.561666312" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.399540 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" event={"ID":"3f0f577c-af8a-414f-ad38-1d1d839d472f","Type":"ContainerStarted","Data":"3d7de0e5a40ed76f0ec11f8065ab82ae3b64c4919dd687e92433c30c45de8bbe"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.429273 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" podStartSLOduration=159.429251842 podStartE2EDuration="2m39.429251842s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.424453877 +0000 UTC m=+227.590194775" watchObservedRunningTime="2026-03-10 15:09:54.429251842 +0000 UTC m=+227.594992740" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.431089 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" event={"ID":"05989857-37b3-4ca7-960b-9f610bd6cd2c","Type":"ContainerStarted","Data":"2bb90f788c1d776860cb66856c2b713a8e3ae47c924ad8cfaa77046e57b075b3"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.431120 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" event={"ID":"05989857-37b3-4ca7-960b-9f610bd6cd2c","Type":"ContainerStarted","Data":"ea1676a004f0006592d08ff218ffaf40730a757f21991ce0880020f63e0ebe2e"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.445151 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" event={"ID":"2207f46f-83de-4190-a784-533331d951e7","Type":"ContainerStarted","Data":"d445e62ff91109b621416224d6d76b58cc29c2756bfcafd4f71685ea4d042548"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.454757 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-cpg78" podStartSLOduration=159.454740999 podStartE2EDuration="2m39.454740999s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.451100437 +0000 UTC m=+227.616841335" watchObservedRunningTime="2026-03-10 15:09:54.454740999 +0000 UTC m=+227.620481897" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.466256 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" event={"ID":"2d64a005-4a73-4a13-886f-0b072b495bcf","Type":"ContainerStarted","Data":"59ac864362864ed769496f9790a5b1e611472374b5fff52a36fb0eacbb7056d3"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.466301 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" event={"ID":"2d64a005-4a73-4a13-886f-0b072b495bcf","Type":"ContainerStarted","Data":"aab399dc141e2c02b26286e0a2d0e754682b4e32d16e5ad480d7db0af517385f"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.482799 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-47qdt" podStartSLOduration=159.482782829 podStartE2EDuration="2m39.482782829s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.480718521 +0000 UTC m=+227.646459419" watchObservedRunningTime="2026-03-10 15:09:54.482782829 +0000 UTC m=+227.648523727" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.484326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:54 crc kubenswrapper[4795]: E0310 15:09:54.486493 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:54.986478613 +0000 UTC m=+228.152219511 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.491402 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-sd26k" event={"ID":"afe203dd-0a5b-44c3-afd1-fe0452f276bc","Type":"ContainerStarted","Data":"7e0357a1712f61c2cb4d482fc6fa0ceff0aed5c4c7c7d28eba09f755977456b2"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.499485 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" event={"ID":"f45a1651-aa0f-44ad-9f76-4bcb22348e90","Type":"ContainerStarted","Data":"6443fcb2274fed3c51ec50d3d0509de22e916e990914ddda08b3c8c7d34c904f"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.500675 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.508814 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6w8jb" podStartSLOduration=159.508798551 podStartE2EDuration="2m39.508798551s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.507257138 +0000 UTC m=+227.672998036" watchObservedRunningTime="2026-03-10 15:09:54.508798551 +0000 UTC m=+227.674539449" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 
15:09:54.510172 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hj256" event={"ID":"3240d503-5d7b-4369-be00-56d215a796c0","Type":"ContainerStarted","Data":"f98a6629a384f206257bfca14194a5385f8898c9f794add7a7cd90dd8f6c378d"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.510226 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-hj256" event={"ID":"3240d503-5d7b-4369-be00-56d215a796c0","Type":"ContainerStarted","Data":"4b775ccde75073e2e4acacda4b9bb80af617e2ac7e59ae67a1a27af8ea9060e5"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.512019 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" event={"ID":"e13b22c6-cfc8-4709-9610-040c0e80b2c4","Type":"ContainerStarted","Data":"bcdb3cd49558c615c34ac71651c057bd9f680d5d33cebbb39c75d443bed926d1"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.512799 4795 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dlcdg container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.512827 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" podUID="f45a1651-aa0f-44ad-9f76-4bcb22348e90" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.515504 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" 
event={"ID":"46936998-cf8b-4370-b4e6-e5a8759c895e","Type":"ContainerStarted","Data":"8b9f6757f364cee1406f1f2212a3f0e35950ff1b000fe1568a4021a032aac0f3"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.516296 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.535315 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zl4pz" podStartSLOduration=159.535294137 podStartE2EDuration="2m39.535294137s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.527740005 +0000 UTC m=+227.693480893" watchObservedRunningTime="2026-03-10 15:09:54.535294137 +0000 UTC m=+227.701035035" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.535941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" event={"ID":"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9","Type":"ContainerStarted","Data":"7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.536266 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.550528 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sqw59 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.550581 4795 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" podUID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.551407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn" event={"ID":"6b6c8d10-50e2-458e-a3fd-0b67c039c705","Type":"ContainerStarted","Data":"0d6d43abbd7b03d20bfd1f879854e7939f893d7aea104c9c2362ee9fecc27326"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.551446 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn" event={"ID":"6b6c8d10-50e2-458e-a3fd-0b67c039c705","Type":"ContainerStarted","Data":"81693494ddd2bce2134a622f49a06389cc353703632c8b1d590e4e47174812c0"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.558473 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-gnrqn" podStartSLOduration=159.55845542 podStartE2EDuration="2m39.55845542s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.558409718 +0000 UTC m=+227.724150616" watchObservedRunningTime="2026-03-10 15:09:54.55845542 +0000 UTC m=+227.724196318" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.559075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" event={"ID":"ad6f1773-b828-4de8-8d5f-0927c9853100","Type":"ContainerStarted","Data":"7242070f9bc853e95b11f3225ce771615a3b341a4cc25b3f792478f73a7d4292"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.575361 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" event={"ID":"597bba13-bc92-4963-a8df-6d32d47d3864","Type":"ContainerStarted","Data":"6b5d3348d087fcb6a8eab3f912bdca2941a343f4bc30bb50035ed6ea11a8a8b1"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.593726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:54 crc kubenswrapper[4795]: E0310 15:09:54.595250 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:55.095212574 +0000 UTC m=+228.260953472 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.596026 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" event={"ID":"f35c76bb-0875-44e1-9d13-9583c9dc29df","Type":"ContainerStarted","Data":"70b3f22eb52ccc5843e7443b564ec691ba29786c503c0d15795971ee69d3225e"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.614977 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-dssjn" podStartSLOduration=159.6149596 podStartE2EDuration="2m39.6149596s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.612987525 +0000 UTC m=+227.778728423" watchObservedRunningTime="2026-03-10 15:09:54.6149596 +0000 UTC m=+227.780700498" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.615994 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6q2qw" event={"ID":"bc9f05ef-47e0-4866-a629-1cf109bf3752","Type":"ContainerStarted","Data":"cc0e758d43a197c99319144d12e54adc70a289983cf180b56b49e1c6734ff53a"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.615841 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" podStartSLOduration=159.615835215 
podStartE2EDuration="2m39.615835215s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.580567372 +0000 UTC m=+227.746308270" watchObservedRunningTime="2026-03-10 15:09:54.615835215 +0000 UTC m=+227.781576113" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.640367 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8mc4" event={"ID":"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976","Type":"ContainerStarted","Data":"88b9a6a3f48887c85b6b4172b11115da122792b12d9fe46d203fed45fa447ab5"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.649817 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq" event={"ID":"457e29f1-69c5-4524-a7e0-78f4944ca94d","Type":"ContainerStarted","Data":"22bce09f2b7fc885132d31329c5bf491ff31dc34e2f94dedc399ffa852b34803"} Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.662615 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-bqvzg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.662664 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bqvzg" podUID="8c484e22-84bb-402d-89ec-5251b11ae7e3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.677187 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.677605 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-99x29" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.683297 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.695240 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:54 crc kubenswrapper[4795]: E0310 15:09:54.698489 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:55.198476502 +0000 UTC m=+228.364217400 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.707276 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-dhm9q" podStartSLOduration=159.707256759 podStartE2EDuration="2m39.707256759s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.653242888 +0000 UTC m=+227.818983786" watchObservedRunningTime="2026-03-10 15:09:54.707256759 +0000 UTC m=+227.872997647" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.709266 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" podStartSLOduration=159.709255235 podStartE2EDuration="2m39.709255235s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.706800006 +0000 UTC m=+227.872540904" watchObservedRunningTime="2026-03-10 15:09:54.709255235 +0000 UTC m=+227.874996133" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.789336 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-hj256" podStartSLOduration=6.789319689 podStartE2EDuration="6.789319689s" podCreationTimestamp="2026-03-10 15:09:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.75346409 +0000 UTC m=+227.919204988" watchObservedRunningTime="2026-03-10 15:09:54.789319689 +0000 UTC m=+227.955060587" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.797208 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:54 crc kubenswrapper[4795]: E0310 15:09:54.798669 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:55.298637272 +0000 UTC m=+228.464378170 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.849406 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" podStartSLOduration=159.849389361 podStartE2EDuration="2m39.849389361s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.796405499 +0000 UTC m=+227.962146397" watchObservedRunningTime="2026-03-10 15:09:54.849389361 +0000 UTC m=+228.015130259" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.850753 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn" podStartSLOduration=159.850746599 podStartE2EDuration="2m39.850746599s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.849782262 +0000 UTC m=+228.015523160" watchObservedRunningTime="2026-03-10 15:09:54.850746599 +0000 UTC m=+228.016487497" Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.899453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: 
\"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:54 crc kubenswrapper[4795]: E0310 15:09:54.899812 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:55.39979463 +0000 UTC m=+228.565535528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:54 crc kubenswrapper[4795]: I0310 15:09:54.941516 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-v4bxk" podStartSLOduration=159.941495394 podStartE2EDuration="2m39.941495394s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.897363071 +0000 UTC m=+228.063103979" watchObservedRunningTime="2026-03-10 15:09:54.941495394 +0000 UTC m=+228.107236292" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.001007 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:55 crc kubenswrapper[4795]: E0310 15:09:55.001431 4795 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:55.501413881 +0000 UTC m=+228.667154779 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.005211 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-2k5kv" podStartSLOduration=160.005193767 podStartE2EDuration="2m40.005193767s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:54.945103005 +0000 UTC m=+228.110843913" watchObservedRunningTime="2026-03-10 15:09:55.005193767 +0000 UTC m=+228.170934675" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.102090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:55 crc kubenswrapper[4795]: E0310 15:09:55.102772 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:55.602754414 +0000 UTC m=+228.768495312 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.110651 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6q2qw" podStartSLOduration=7.110627546 podStartE2EDuration="7.110627546s" podCreationTimestamp="2026-03-10 15:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:55.076891176 +0000 UTC m=+228.242632074" watchObservedRunningTime="2026-03-10 15:09:55.110627546 +0000 UTC m=+228.276368454" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.136788 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kxzzq" podStartSLOduration=160.136768322 podStartE2EDuration="2m40.136768322s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:55.134002474 +0000 UTC m=+228.299743382" watchObservedRunningTime="2026-03-10 15:09:55.136768322 +0000 UTC m=+228.302509220" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.203985 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:55 crc kubenswrapper[4795]: E0310 15:09:55.204179 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:55.704157809 +0000 UTC m=+228.869898717 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.204256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:55 crc kubenswrapper[4795]: E0310 15:09:55.204588 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:55.704578531 +0000 UTC m=+228.870319429 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.305092 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:55 crc kubenswrapper[4795]: E0310 15:09:55.305602 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:55.805583975 +0000 UTC m=+228.971324873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.332873 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:55 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:09:55 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:09:55 crc kubenswrapper[4795]: healthz check failed Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.332929 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.406876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:55 crc kubenswrapper[4795]: E0310 15:09:55.407408 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-10 15:09:55.907374401 +0000 UTC m=+229.073115299 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.508628 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:55 crc kubenswrapper[4795]: E0310 15:09:55.509124 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:56.009104145 +0000 UTC m=+229.174845043 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.610241 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.610564 4795 ???:1] "http: TLS handshake error from 192.168.126.11:54864: no serving certificate available for the kubelet" Mar 10 15:09:55 crc kubenswrapper[4795]: E0310 15:09:55.610534 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:56.11052154 +0000 UTC m=+229.276262438 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.665325 4795 generic.go:334] "Generic (PLEG): container finished" podID="e13b22c6-cfc8-4709-9610-040c0e80b2c4" containerID="e458b93e7b819374032ddc6a14f957904e15303ca78eb2b79afb009adab4b33a" exitCode=0 Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.665378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" event={"ID":"e13b22c6-cfc8-4709-9610-040c0e80b2c4","Type":"ContainerDied","Data":"e458b93e7b819374032ddc6a14f957904e15303ca78eb2b79afb009adab4b33a"} Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.689311 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8v6fj" event={"ID":"813b1cb2-7fca-4632-b1e1-a36cb988f944","Type":"ContainerStarted","Data":"f6fa0d8d0b256084e52a17ff80ed7e359cdcb7456e4d560b1f14f06b84ac7816"} Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.689349 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8v6fj" event={"ID":"813b1cb2-7fca-4632-b1e1-a36cb988f944","Type":"ContainerStarted","Data":"f25ee823a369ed7661696031ab3831ddcd3a54466535671f777928b67c0710c9"} Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.689437 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-8v6fj" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.700846 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk" event={"ID":"cbf0019e-fefb-4cec-a75f-d8984f8fb0b8","Type":"ContainerStarted","Data":"099609765de624719ce7743bfb663693e150eb30d8ba84ae54695d08a42fe8ab"} Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.701180 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.712278 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:55 crc kubenswrapper[4795]: E0310 15:09:55.716743 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:56.21671984 +0000 UTC m=+229.382460748 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.724326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6q2qw" event={"ID":"bc9f05ef-47e0-4866-a629-1cf109bf3752","Type":"ContainerStarted","Data":"e1e89392913089032438a7c1822fba7d48afebaacb816d706ce9567538d45157"} Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.738572 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" event={"ID":"3f0f577c-af8a-414f-ad38-1d1d839d472f","Type":"ContainerStarted","Data":"0ed87232b25513804c1761d5a7f36350928e128dfc6b2fb914b1f5e86f75a677"} Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.738620 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" event={"ID":"3f0f577c-af8a-414f-ad38-1d1d839d472f","Type":"ContainerStarted","Data":"250541087fcd40caa8f505e1e405332da30418478a5a14e7a415ad9abdb26631"} Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.785406 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8mc4" event={"ID":"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976","Type":"ContainerStarted","Data":"02e04e77b772628c7e7a2c79583d7189d5485303e245926d95b54efaf153125c"} Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.807090 4795 generic.go:334] "Generic (PLEG): container finished" podID="03b6f1dc-43b6-489a-8990-7f4a9a33d535" 
containerID="aea0e70f6e58c56c1e9f636a20f411d3cb8c8d9027e861a9bee7812642b07e3f" exitCode=0 Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.808465 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" event={"ID":"03b6f1dc-43b6-489a-8990-7f4a9a33d535","Type":"ContainerDied","Data":"aea0e70f6e58c56c1e9f636a20f411d3cb8c8d9027e861a9bee7812642b07e3f"} Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.811866 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8v6fj" podStartSLOduration=7.811854219 podStartE2EDuration="7.811854219s" podCreationTimestamp="2026-03-10 15:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:55.792112883 +0000 UTC m=+228.957853781" watchObservedRunningTime="2026-03-10 15:09:55.811854219 +0000 UTC m=+228.977595117" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.816460 4795 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sqw59 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.816520 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" podUID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.33:8080/healthz\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.816804 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-bqvzg container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 
10.217.0.11:8080: connect: connection refused" start-of-body= Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.816851 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bqvzg" podUID="8c484e22-84bb-402d-89ec-5251b11ae7e3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.826806 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-xfqrl" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.830477 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.835418 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:55 crc kubenswrapper[4795]: E0310 15:09:55.836937 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:56.336921184 +0000 UTC m=+229.502662082 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.862693 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7csjs" podStartSLOduration=160.862678799 podStartE2EDuration="2m40.862678799s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:55.861476986 +0000 UTC m=+229.027217874" watchObservedRunningTime="2026-03-10 15:09:55.862678799 +0000 UTC m=+229.028419697" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.869401 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dlcdg" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.928905 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk" podStartSLOduration=160.928891064 podStartE2EDuration="2m40.928891064s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:55.926651161 +0000 UTC m=+229.092392059" watchObservedRunningTime="2026-03-10 15:09:55.928891064 +0000 UTC m=+229.094631962" Mar 10 15:09:55 crc kubenswrapper[4795]: I0310 15:09:55.952198 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:55 crc kubenswrapper[4795]: E0310 15:09:55.953987 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:56.45397104 +0000 UTC m=+229.619711938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.054787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:56 crc kubenswrapper[4795]: E0310 15:09:56.055172 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:56.555160089 +0000 UTC m=+229.720900987 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.155632 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:56 crc kubenswrapper[4795]: E0310 15:09:56.155951 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:56.655936106 +0000 UTC m=+229.821677004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.257281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:56 crc kubenswrapper[4795]: E0310 15:09:56.257745 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:56.757727442 +0000 UTC m=+229.923468340 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.292172 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-8kghz" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.330793 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:56 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:09:56 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:09:56 crc kubenswrapper[4795]: healthz check failed Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.330844 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.359390 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:56 crc kubenswrapper[4795]: E0310 15:09:56.359858 4795 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:56.859835827 +0000 UTC m=+230.025576725 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.383939 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nnfqt"] Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.385149 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.395904 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.404195 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnfqt"] Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.463727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-utilities\") pod \"community-operators-nnfqt\" (UID: \"33c07b9b-efc7-4610-85d2-21e44611aa32\") " pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.463769 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.463793 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-catalog-content\") pod \"community-operators-nnfqt\" (UID: \"33c07b9b-efc7-4610-85d2-21e44611aa32\") " pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.463860 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k45vl\" (UniqueName: 
\"kubernetes.io/projected/33c07b9b-efc7-4610-85d2-21e44611aa32-kube-api-access-k45vl\") pod \"community-operators-nnfqt\" (UID: \"33c07b9b-efc7-4610-85d2-21e44611aa32\") " pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:09:56 crc kubenswrapper[4795]: E0310 15:09:56.464145 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:56.964133643 +0000 UTC m=+230.129874541 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.564964 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:56 crc kubenswrapper[4795]: E0310 15:09:56.565140 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.065122226 +0000 UTC m=+230.230863124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.565218 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-utilities\") pod \"community-operators-nnfqt\" (UID: \"33c07b9b-efc7-4610-85d2-21e44611aa32\") " pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.565258 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.565286 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-catalog-content\") pod \"community-operators-nnfqt\" (UID: \"33c07b9b-efc7-4610-85d2-21e44611aa32\") " pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.565366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k45vl\" (UniqueName: \"kubernetes.io/projected/33c07b9b-efc7-4610-85d2-21e44611aa32-kube-api-access-k45vl\") pod \"community-operators-nnfqt\" (UID: 
\"33c07b9b-efc7-4610-85d2-21e44611aa32\") " pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:09:56 crc kubenswrapper[4795]: E0310 15:09:56.565531 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.065524038 +0000 UTC m=+230.231264936 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.565813 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-catalog-content\") pod \"community-operators-nnfqt\" (UID: \"33c07b9b-efc7-4610-85d2-21e44611aa32\") " pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.565930 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-utilities\") pod \"community-operators-nnfqt\" (UID: \"33c07b9b-efc7-4610-85d2-21e44611aa32\") " pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.585179 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kldkn"] Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.586023 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.589114 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k45vl\" (UniqueName: \"kubernetes.io/projected/33c07b9b-efc7-4610-85d2-21e44611aa32-kube-api-access-k45vl\") pod \"community-operators-nnfqt\" (UID: \"33c07b9b-efc7-4610-85d2-21e44611aa32\") " pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.589417 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.603843 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kldkn"] Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.666656 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.667991 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-catalog-content\") pod \"certified-operators-kldkn\" (UID: \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.668160 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-utilities\") pod \"certified-operators-kldkn\" (UID: 
\"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.668293 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xrwm\" (UniqueName: \"kubernetes.io/projected/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-kube-api-access-8xrwm\") pod \"certified-operators-kldkn\" (UID: \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:09:56 crc kubenswrapper[4795]: E0310 15:09:56.668505 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.168489337 +0000 UTC m=+230.334230235 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.719378 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.771837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-catalog-content\") pod \"certified-operators-kldkn\" (UID: \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.772144 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-utilities\") pod \"certified-operators-kldkn\" (UID: \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.772183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.772205 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xrwm\" (UniqueName: \"kubernetes.io/projected/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-kube-api-access-8xrwm\") pod \"certified-operators-kldkn\" (UID: \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.772488 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-catalog-content\") pod 
\"certified-operators-kldkn\" (UID: \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.772753 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-utilities\") pod \"certified-operators-kldkn\" (UID: \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:09:56 crc kubenswrapper[4795]: E0310 15:09:56.772810 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.272799854 +0000 UTC m=+230.438540742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.774091 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-47zhw"] Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.775052 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.807898 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xrwm\" (UniqueName: \"kubernetes.io/projected/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-kube-api-access-8xrwm\") pod \"certified-operators-kldkn\" (UID: \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.832560 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47zhw"] Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.850675 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nhjkm" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.873524 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.873744 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-utilities\") pod \"community-operators-47zhw\" (UID: \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.873765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m6ts\" (UniqueName: \"kubernetes.io/projected/2ead40a3-f58b-4385-9cad-7a1d04229bb3-kube-api-access-6m6ts\") pod \"community-operators-47zhw\" (UID: 
\"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.873785 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-catalog-content\") pod \"community-operators-47zhw\" (UID: \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:09:56 crc kubenswrapper[4795]: E0310 15:09:56.873922 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.37390807 +0000 UTC m=+230.539648968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.875283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" event={"ID":"03b6f1dc-43b6-489a-8990-7f4a9a33d535","Type":"ContainerStarted","Data":"d4d59c9f373317e4c4154449d03b5348451ca2f9bb8d7bdf1220d60af892247f"} Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.907124 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" 
event={"ID":"e13b22c6-cfc8-4709-9610-040c0e80b2c4","Type":"ContainerStarted","Data":"dd237ae1d789c0457aeefb397333a1a915440b8391c4ef5eb4873c280f5ca4e6"} Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.924372 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.978568 4795 generic.go:334] "Generic (PLEG): container finished" podID="6b6c8d10-50e2-458e-a3fd-0b67c039c705" containerID="0d6d43abbd7b03d20bfd1f879854e7939f893d7aea104c9c2362ee9fecc27326" exitCode=0 Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.979427 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn" event={"ID":"6b6c8d10-50e2-458e-a3fd-0b67c039c705","Type":"ContainerDied","Data":"0d6d43abbd7b03d20bfd1f879854e7939f893d7aea104c9c2362ee9fecc27326"} Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.981895 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-utilities\") pod \"community-operators-47zhw\" (UID: \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.981917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m6ts\" (UniqueName: \"kubernetes.io/projected/2ead40a3-f58b-4385-9cad-7a1d04229bb3-kube-api-access-6m6ts\") pod \"community-operators-47zhw\" (UID: \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.981940 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-catalog-content\") pod 
\"community-operators-47zhw\" (UID: \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.981977 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.982977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-utilities\") pod \"community-operators-47zhw\" (UID: \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:09:56 crc kubenswrapper[4795]: I0310 15:09:56.983388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-catalog-content\") pod \"community-operators-47zhw\" (UID: \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:09:56 crc kubenswrapper[4795]: E0310 15:09:56.983600 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.483588958 +0000 UTC m=+230.649329846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.020457 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zhm7h"] Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.021523 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.034673 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" podStartSLOduration=162.034658296 podStartE2EDuration="2m42.034658296s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:57.03230594 +0000 UTC m=+230.198046838" watchObservedRunningTime="2026-03-10 15:09:57.034658296 +0000 UTC m=+230.200399194" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.035036 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zhm7h"] Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.084971 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:57 crc 
kubenswrapper[4795]: I0310 15:09:57.085291 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-catalog-content\") pod \"certified-operators-zhm7h\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.085928 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khz2x\" (UniqueName: \"kubernetes.io/projected/a53e736d-c4d5-458a-940d-fd1a0719fc45-kube-api-access-khz2x\") pod \"certified-operators-zhm7h\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.086061 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-utilities\") pod \"certified-operators-zhm7h\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:09:57 crc kubenswrapper[4795]: E0310 15:09:57.088010 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.587983147 +0000 UTC m=+230.753724045 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.119303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m6ts\" (UniqueName: \"kubernetes.io/projected/2ead40a3-f58b-4385-9cad-7a1d04229bb3-kube-api-access-6m6ts\") pod \"community-operators-47zhw\" (UID: \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.188358 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khz2x\" (UniqueName: \"kubernetes.io/projected/a53e736d-c4d5-458a-940d-fd1a0719fc45-kube-api-access-khz2x\") pod \"certified-operators-zhm7h\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.188406 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-utilities\") pod \"certified-operators-zhm7h\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.188481 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: 
\"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.188521 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-catalog-content\") pod \"certified-operators-zhm7h\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.188869 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-catalog-content\") pod \"certified-operators-zhm7h\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.189113 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-utilities\") pod \"certified-operators-zhm7h\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:09:57 crc kubenswrapper[4795]: E0310 15:09:57.189134 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.689087714 +0000 UTC m=+230.854828612 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.223704 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nnfqt"] Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.242972 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khz2x\" (UniqueName: \"kubernetes.io/projected/a53e736d-c4d5-458a-940d-fd1a0719fc45-kube-api-access-khz2x\") pod \"certified-operators-zhm7h\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:09:57 crc kubenswrapper[4795]: W0310 15:09:57.277499 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33c07b9b_efc7_4610_85d2_21e44611aa32.slice/crio-8a0171baa8e5a64e083a88d4b6997b41dab73898173d2e953b6e5ca065963d60 WatchSource:0}: Error finding container 8a0171baa8e5a64e083a88d4b6997b41dab73898173d2e953b6e5ca065963d60: Status 404 returned error can't find the container with id 8a0171baa8e5a64e083a88d4b6997b41dab73898173d2e953b6e5ca065963d60 Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.296272 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:57 crc 
kubenswrapper[4795]: E0310 15:09:57.296613 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.796599031 +0000 UTC m=+230.962339929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.351835 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:57 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:09:57 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:09:57 crc kubenswrapper[4795]: healthz check failed Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.351888 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.379132 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.397816 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:57 crc kubenswrapper[4795]: E0310 15:09:57.398124 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.898114029 +0000 UTC m=+231.063854927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.408788 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.498395 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kldkn"] Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.498851 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:57 crc kubenswrapper[4795]: E0310 15:09:57.499017 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.998998109 +0000 UTC m=+231.164739007 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.499058 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:57 crc kubenswrapper[4795]: E0310 15:09:57.499390 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:57.99937241 +0000 UTC m=+231.165113308 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.531116 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cg27q"] Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.563738 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.564053 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv" Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.601577 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:57 crc kubenswrapper[4795]: E0310 15:09:57.602085 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.10205415 +0000 UTC m=+231.267795038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.606187 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:57 crc kubenswrapper[4795]: E0310 15:09:57.606555 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.106541276 +0000 UTC m=+231.272282184 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.609801 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss"] Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.707360 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:57 crc kubenswrapper[4795]: E0310 15:09:57.707895 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.207859079 +0000 UTC m=+231.373599967 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.707976 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:57 crc kubenswrapper[4795]: E0310 15:09:57.709045 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.209036322 +0000 UTC m=+231.374777220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.809482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:57 crc kubenswrapper[4795]: E0310 15:09:57.809830 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.309816449 +0000 UTC m=+231.475557347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:57 crc kubenswrapper[4795]: I0310 15:09:57.911778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:57 crc kubenswrapper[4795]: E0310 15:09:57.912226 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.412211192 +0000 UTC m=+231.577952090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.016723 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:58 crc kubenswrapper[4795]: E0310 15:09:58.017415 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.517400924 +0000 UTC m=+231.683141822 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.017701 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8mc4" event={"ID":"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976","Type":"ContainerStarted","Data":"20523d7b83be03233c6e68e8c8f9a236b1c12bd6bd693ce69ab789c9e70e9fda"} Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.019759 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkn" event={"ID":"9ac91d8b-c0bc-4758-91d3-9fa275d88e02","Type":"ContainerStarted","Data":"7ae2c6e228797d6b9f29ed0a401d6c6005d2852220fe01214cd195fbc946466c"} Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.021143 4795 generic.go:334] "Generic (PLEG): container finished" podID="33c07b9b-efc7-4610-85d2-21e44611aa32" containerID="bdf631e29302b11d19d668c30b5e2c31e5d2d2aa238087c7c8dbd4a895fd0b61" exitCode=0 Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.021187 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnfqt" event={"ID":"33c07b9b-efc7-4610-85d2-21e44611aa32","Type":"ContainerDied","Data":"bdf631e29302b11d19d668c30b5e2c31e5d2d2aa238087c7c8dbd4a895fd0b61"} Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.021205 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnfqt" event={"ID":"33c07b9b-efc7-4610-85d2-21e44611aa32","Type":"ContainerStarted","Data":"8a0171baa8e5a64e083a88d4b6997b41dab73898173d2e953b6e5ca065963d60"} Mar 
10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.067334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" event={"ID":"03b6f1dc-43b6-489a-8990-7f4a9a33d535","Type":"ContainerStarted","Data":"31ea6e654e5cc4cb3c48cc7593f7a94b190ab42249ab12f048261147e0ad54a5"} Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.068609 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" podUID="676a93ca-0674-4884-991b-89518b118411" containerName="controller-manager" containerID="cri-o://bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8" gracePeriod=30 Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.094408 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" podStartSLOduration=163.094393612 podStartE2EDuration="2m43.094393612s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:58.089543575 +0000 UTC m=+231.255284463" watchObservedRunningTime="2026-03-10 15:09:58.094393612 +0000 UTC m=+231.260134510" Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.111659 4795 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.118509 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:58 
crc kubenswrapper[4795]: E0310 15:09:58.122524 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.622508243 +0000 UTC m=+231.788249141 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.160813 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-47zhw"] Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.202342 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zhm7h"] Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.220265 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:58 crc kubenswrapper[4795]: E0310 15:09:58.220562 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.720541833 +0000 UTC m=+231.886282731 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.220700 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:09:58 crc kubenswrapper[4795]: E0310 15:09:58.221075 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.721065438 +0000 UTC m=+231.886806336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.233262 4795 ???:1] "http: TLS handshake error from 192.168.126.11:59336: no serving certificate available for the kubelet" Mar 10 15:09:58 crc kubenswrapper[4795]: W0310 15:09:58.262675 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda53e736d_c4d5_458a_940d_fd1a0719fc45.slice/crio-b5026d29114e4ecbb0ac1a20cb69c0bce653a82afc55ab4d5383e1878aada4d7 WatchSource:0}: Error finding container b5026d29114e4ecbb0ac1a20cb69c0bce653a82afc55ab4d5383e1878aada4d7: Status 404 returned error can't find the container with id b5026d29114e4ecbb0ac1a20cb69c0bce653a82afc55ab4d5383e1878aada4d7 Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.322010 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 10 15:09:58 crc kubenswrapper[4795]: E0310 15:09:58.322422 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.822408531 +0000 UTC m=+231.988149419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.335052 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:58 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:09:58 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:09:58 crc kubenswrapper[4795]: healthz check failed Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.335113 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.410527 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.419224 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.423557 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:58 crc kubenswrapper[4795]: E0310 15:09:58.425075 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:58.923912859 +0000 UTC m=+232.089653757 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.524756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqmrb\" (UniqueName: \"kubernetes.io/projected/6b6c8d10-50e2-458e-a3fd-0b67c039c705-kube-api-access-dqmrb\") pod \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") "
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.524806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b6c8d10-50e2-458e-a3fd-0b67c039c705-config-volume\") pod \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") "
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.524901 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.524964 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b6c8d10-50e2-458e-a3fd-0b67c039c705-secret-volume\") pod \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\" (UID: \"6b6c8d10-50e2-458e-a3fd-0b67c039c705\") "
Mar 10 15:09:58 crc kubenswrapper[4795]: E0310 15:09:58.525549 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:59.02552343 +0000 UTC m=+232.191264348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.525918 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b6c8d10-50e2-458e-a3fd-0b67c039c705-config-volume" (OuterVolumeSpecName: "config-volume") pod "6b6c8d10-50e2-458e-a3fd-0b67c039c705" (UID: "6b6c8d10-50e2-458e-a3fd-0b67c039c705"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.530580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b6c8d10-50e2-458e-a3fd-0b67c039c705-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6b6c8d10-50e2-458e-a3fd-0b67c039c705" (UID: "6b6c8d10-50e2-458e-a3fd-0b67c039c705"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.530666 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6c8d10-50e2-458e-a3fd-0b67c039c705-kube-api-access-dqmrb" (OuterVolumeSpecName: "kube-api-access-dqmrb") pod "6b6c8d10-50e2-458e-a3fd-0b67c039c705" (UID: "6b6c8d10-50e2-458e-a3fd-0b67c039c705"). InnerVolumeSpecName "kube-api-access-dqmrb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.543641 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.573153 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9fcm8"]
Mar 10 15:09:58 crc kubenswrapper[4795]: E0310 15:09:58.573368 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="676a93ca-0674-4884-991b-89518b118411" containerName="controller-manager"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.573381 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="676a93ca-0674-4884-991b-89518b118411" containerName="controller-manager"
Mar 10 15:09:58 crc kubenswrapper[4795]: E0310 15:09:58.573395 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6c8d10-50e2-458e-a3fd-0b67c039c705" containerName="collect-profiles"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.573400 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6c8d10-50e2-458e-a3fd-0b67c039c705" containerName="collect-profiles"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.573483 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6c8d10-50e2-458e-a3fd-0b67c039c705" containerName="collect-profiles"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.573498 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="676a93ca-0674-4884-991b-89518b118411" containerName="controller-manager"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.582854 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9fcm8"]
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.584742 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9fcm8"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.587033 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.625627 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676a93ca-0674-4884-991b-89518b118411-serving-cert\") pod \"676a93ca-0674-4884-991b-89518b118411\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") "
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.625899 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-client-ca\") pod \"676a93ca-0674-4884-991b-89518b118411\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") "
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.626051 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-proxy-ca-bundles\") pod \"676a93ca-0674-4884-991b-89518b118411\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") "
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.626124 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gclg\" (UniqueName: \"kubernetes.io/projected/676a93ca-0674-4884-991b-89518b118411-kube-api-access-4gclg\") pod \"676a93ca-0674-4884-991b-89518b118411\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") "
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.626194 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-config\") pod \"676a93ca-0674-4884-991b-89518b118411\" (UID: \"676a93ca-0674-4884-991b-89518b118411\") "
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.626404 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-catalog-content\") pod \"redhat-marketplace-9fcm8\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " pod="openshift-marketplace/redhat-marketplace-9fcm8"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.626460 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w29bh\" (UniqueName: \"kubernetes.io/projected/ae045732-f556-4808-bcd3-114aed4f8414-kube-api-access-w29bh\") pod \"redhat-marketplace-9fcm8\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " pod="openshift-marketplace/redhat-marketplace-9fcm8"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.626563 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.626604 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-utilities\") pod \"redhat-marketplace-9fcm8\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " pod="openshift-marketplace/redhat-marketplace-9fcm8"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.626709 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b6c8d10-50e2-458e-a3fd-0b67c039c705-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.626732 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqmrb\" (UniqueName: \"kubernetes.io/projected/6b6c8d10-50e2-458e-a3fd-0b67c039c705-kube-api-access-dqmrb\") on node \"crc\" DevicePath \"\""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.626745 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b6c8d10-50e2-458e-a3fd-0b67c039c705-config-volume\") on node \"crc\" DevicePath \"\""
Mar 10 15:09:58 crc kubenswrapper[4795]: E0310 15:09:58.631205 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:59.131184285 +0000 UTC m=+232.296925243 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.631519 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/676a93ca-0674-4884-991b-89518b118411-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "676a93ca-0674-4884-991b-89518b118411" (UID: "676a93ca-0674-4884-991b-89518b118411"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.631788 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "676a93ca-0674-4884-991b-89518b118411" (UID: "676a93ca-0674-4884-991b-89518b118411"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.635480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-config" (OuterVolumeSpecName: "config") pod "676a93ca-0674-4884-991b-89518b118411" (UID: "676a93ca-0674-4884-991b-89518b118411"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.637713 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/676a93ca-0674-4884-991b-89518b118411-kube-api-access-4gclg" (OuterVolumeSpecName: "kube-api-access-4gclg") pod "676a93ca-0674-4884-991b-89518b118411" (UID: "676a93ca-0674-4884-991b-89518b118411"). InnerVolumeSpecName "kube-api-access-4gclg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.627473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-client-ca" (OuterVolumeSpecName: "client-ca") pod "676a93ca-0674-4884-991b-89518b118411" (UID: "676a93ca-0674-4884-991b-89518b118411"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.727973 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 15:09:58 crc kubenswrapper[4795]: E0310 15:09:58.728284 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-10 15:09:59.228260038 +0000 UTC m=+232.394000936 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.728809 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-catalog-content\") pod \"redhat-marketplace-9fcm8\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " pod="openshift-marketplace/redhat-marketplace-9fcm8"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.728855 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w29bh\" (UniqueName: \"kubernetes.io/projected/ae045732-f556-4808-bcd3-114aed4f8414-kube-api-access-w29bh\") pod \"redhat-marketplace-9fcm8\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " pod="openshift-marketplace/redhat-marketplace-9fcm8"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.728916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.728943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-utilities\") pod \"redhat-marketplace-9fcm8\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " pod="openshift-marketplace/redhat-marketplace-9fcm8"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.729054 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.729073 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676a93ca-0674-4884-991b-89518b118411-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.729084 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.729108 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/676a93ca-0674-4884-991b-89518b118411-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.729119 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gclg\" (UniqueName: \"kubernetes.io/projected/676a93ca-0674-4884-991b-89518b118411-kube-api-access-4gclg\") on node \"crc\" DevicePath \"\""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.729482 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-utilities\") pod \"redhat-marketplace-9fcm8\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " pod="openshift-marketplace/redhat-marketplace-9fcm8"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.728636 4795 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-10T15:09:58.111679608Z","Handler":null,"Name":""}
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.731464 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-catalog-content\") pod \"redhat-marketplace-9fcm8\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " pod="openshift-marketplace/redhat-marketplace-9fcm8"
Mar 10 15:09:58 crc kubenswrapper[4795]: E0310 15:09:58.732104 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-10 15:09:59.232070245 +0000 UTC m=+232.397811203 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vn4gm" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.746057 4795 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.746116 4795 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.757810 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w29bh\" (UniqueName: \"kubernetes.io/projected/ae045732-f556-4808-bcd3-114aed4f8414-kube-api-access-w29bh\") pod \"redhat-marketplace-9fcm8\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " pod="openshift-marketplace/redhat-marketplace-9fcm8"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.842927 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.849199 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.906989 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9fcm8"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.943964 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.945049 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.947920 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.948427 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.948457 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.951234 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.952746 4795 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.952781 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.981272 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4jngq"]
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.988840 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jngq"]
Mar 10 15:09:58 crc kubenswrapper[4795]: I0310 15:09:58.988971 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jngq"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.004249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vn4gm\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") " pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.049562 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20fd9386-5181-4104-8413-7191c2c79b4b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"20fd9386-5181-4104-8413-7191c2c79b4b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.049657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20fd9386-5181-4104-8413-7191c2c79b4b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"20fd9386-5181-4104-8413-7191c2c79b4b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.089780 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8mc4" event={"ID":"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976","Type":"ContainerStarted","Data":"e88d54bad11eeb4c67be7a73500763a97046aa979cc3594c5877a70468e1dc5e"}
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.089824 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h8mc4" event={"ID":"e4e50d6e-2a2d-43f5-86bb-0aa6c55f3976","Type":"ContainerStarted","Data":"f1bbf8982d006f1b387f68c35198ffa28e5b4386b4c941cf0871df1452f9ce23"}
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.102333 4795 generic.go:334] "Generic (PLEG): container finished" podID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" containerID="a86ad1d2421dd3da36b143fe595816368e200d0090cd073ac6ceb4bb2020bcbf" exitCode=0
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.102408 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkn" event={"ID":"9ac91d8b-c0bc-4758-91d3-9fa275d88e02","Type":"ContainerDied","Data":"a86ad1d2421dd3da36b143fe595816368e200d0090cd073ac6ceb4bb2020bcbf"}
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.105051 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn" event={"ID":"6b6c8d10-50e2-458e-a3fd-0b67c039c705","Type":"ContainerDied","Data":"81693494ddd2bce2134a622f49a06389cc353703632c8b1d590e4e47174812c0"}
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.105082 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81693494ddd2bce2134a622f49a06389cc353703632c8b1d590e4e47174812c0"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.105217 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.116625 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-h8mc4" podStartSLOduration=11.116608511999999 podStartE2EDuration="11.116608512s" podCreationTimestamp="2026-03-10 15:09:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:09:59.113235757 +0000 UTC m=+232.278976675" watchObservedRunningTime="2026-03-10 15:09:59.116608512 +0000 UTC m=+232.282349410"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.127082 4795 generic.go:334] "Generic (PLEG): container finished" podID="676a93ca-0674-4884-991b-89518b118411" containerID="bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8" exitCode=0
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.127182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" event={"ID":"676a93ca-0674-4884-991b-89518b118411","Type":"ContainerDied","Data":"bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8"}
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.127209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q" event={"ID":"676a93ca-0674-4884-991b-89518b118411","Type":"ContainerDied","Data":"47ad4ad3e4bcfa51c34f837f8ff117d7d66181db71d88336ad57ba89f0c6d154"}
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.127257 4795 scope.go:117] "RemoveContainer" containerID="bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.127275 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-cg27q"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.136773 4795 generic.go:334] "Generic (PLEG): container finished" podID="a53e736d-c4d5-458a-940d-fd1a0719fc45" containerID="40ccda6d4ffc222fd258fc09f5796cdc7f93b64498723c989580c2a84c17abdd" exitCode=0
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.136835 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhm7h" event={"ID":"a53e736d-c4d5-458a-940d-fd1a0719fc45","Type":"ContainerDied","Data":"40ccda6d4ffc222fd258fc09f5796cdc7f93b64498723c989580c2a84c17abdd"}
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.136859 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhm7h" event={"ID":"a53e736d-c4d5-458a-940d-fd1a0719fc45","Type":"ContainerStarted","Data":"b5026d29114e4ecbb0ac1a20cb69c0bce653a82afc55ab4d5383e1878aada4d7"}
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.144232 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" containerID="395efe7bd1f936374a7092daa0cf2f1f04d3442fb161f6e3986107cb91c0a958" exitCode=0
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.144281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zhw" event={"ID":"2ead40a3-f58b-4385-9cad-7a1d04229bb3","Type":"ContainerDied","Data":"395efe7bd1f936374a7092daa0cf2f1f04d3442fb161f6e3986107cb91c0a958"}
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.144326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zhw" event={"ID":"2ead40a3-f58b-4385-9cad-7a1d04229bb3","Type":"ContainerStarted","Data":"9e5d6fbff173297531524291190a580facfd00e037a31cfee291ea277c01275d"}
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.146109 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" podUID="4705e450-7ac4-4741-bab8-e17cb6a79050" containerName="route-controller-manager" containerID="cri-o://550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e" gracePeriod=30
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.151003 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g57p7\" (UniqueName: \"kubernetes.io/projected/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-kube-api-access-g57p7\") pod \"redhat-marketplace-4jngq\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " pod="openshift-marketplace/redhat-marketplace-4jngq"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.151096 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20fd9386-5181-4104-8413-7191c2c79b4b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"20fd9386-5181-4104-8413-7191c2c79b4b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.151159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20fd9386-5181-4104-8413-7191c2c79b4b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"20fd9386-5181-4104-8413-7191c2c79b4b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.151182 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-utilities\") pod \"redhat-marketplace-4jngq\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " pod="openshift-marketplace/redhat-marketplace-4jngq"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.151229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-catalog-content\") pod \"redhat-marketplace-4jngq\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " pod="openshift-marketplace/redhat-marketplace-4jngq"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.151456 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20fd9386-5181-4104-8413-7191c2c79b4b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"20fd9386-5181-4104-8413-7191c2c79b4b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.167852 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6slmv"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.179854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20fd9386-5181-4104-8413-7191c2c79b4b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"20fd9386-5181-4104-8413-7191c2c79b4b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.189264 4795 scope.go:117] "RemoveContainer" containerID="bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8"
Mar 10 15:09:59 crc kubenswrapper[4795]: E0310 15:09:59.192192 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8\": container with ID starting with bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8 not found: ID does not exist" containerID="bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.192234 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8"} err="failed to get container status \"bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8\": rpc error: code = NotFound desc = could not find container \"bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8\": container with ID starting with bad685b92cfb94eab361a5918c1c2ddde0fe26a8347a1da39c1f89cd42f425f8 not found: ID does not exist"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.227006 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.232454 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cg27q"]
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.247664 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-cg27q"]
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.252626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-utilities\") pod \"redhat-marketplace-4jngq\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " pod="openshift-marketplace/redhat-marketplace-4jngq"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.252788 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-catalog-content\") pod \"redhat-marketplace-4jngq\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " pod="openshift-marketplace/redhat-marketplace-4jngq"
Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.252831 4795
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g57p7\" (UniqueName: \"kubernetes.io/projected/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-kube-api-access-g57p7\") pod \"redhat-marketplace-4jngq\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " pod="openshift-marketplace/redhat-marketplace-4jngq" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.254901 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-catalog-content\") pod \"redhat-marketplace-4jngq\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " pod="openshift-marketplace/redhat-marketplace-4jngq" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.255436 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-utilities\") pod \"redhat-marketplace-4jngq\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " pod="openshift-marketplace/redhat-marketplace-4jngq" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.276358 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.292303 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g57p7\" (UniqueName: \"kubernetes.io/projected/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-kube-api-access-g57p7\") pod \"redhat-marketplace-4jngq\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " pod="openshift-marketplace/redhat-marketplace-4jngq" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.323821 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jngq" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.334397 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:09:59 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:09:59 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:09:59 crc kubenswrapper[4795]: healthz check failed Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.334734 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.396225 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9fcm8"] Mar 10 15:09:59 crc kubenswrapper[4795]: W0310 15:09:59.424038 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae045732_f556_4808_bcd3_114aed4f8414.slice/crio-d5acd2b5f7207fb31aac14d3fa256298b93479ff0d6c54300d834dc404a54d58 WatchSource:0}: Error finding container d5acd2b5f7207fb31aac14d3fa256298b93479ff0d6c54300d834dc404a54d58: Status 404 returned error can't find the container with id d5acd2b5f7207fb31aac14d3fa256298b93479ff0d6c54300d834dc404a54d58 Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.484655 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="676a93ca-0674-4884-991b-89518b118411" path="/var/lib/kubelet/pods/676a93ca-0674-4884-991b-89518b118411/volumes" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.485393 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.568160 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rltg9"] Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.570963 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.574311 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.577334 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rltg9"] Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.662938 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-catalog-content\") pod \"redhat-operators-rltg9\" (UID: \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.663029 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-utilities\") pod \"redhat-operators-rltg9\" (UID: \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.663053 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmqhh\" (UniqueName: \"kubernetes.io/projected/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-kube-api-access-dmqhh\") pod \"redhat-operators-rltg9\" (UID: 
\"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.744313 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.757461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vn4gm"] Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.764574 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-utilities\") pod \"redhat-operators-rltg9\" (UID: \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.764610 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmqhh\" (UniqueName: \"kubernetes.io/projected/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-kube-api-access-dmqhh\") pod \"redhat-operators-rltg9\" (UID: \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.764677 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-catalog-content\") pod \"redhat-operators-rltg9\" (UID: \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.765058 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-catalog-content\") pod \"redhat-operators-rltg9\" (UID: 
\"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.765324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-utilities\") pod \"redhat-operators-rltg9\" (UID: \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.786527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmqhh\" (UniqueName: \"kubernetes.io/projected/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-kube-api-access-dmqhh\") pod \"redhat-operators-rltg9\" (UID: \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.799683 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jngq"] Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.865414 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705e450-7ac4-4741-bab8-e17cb6a79050-serving-cert\") pod \"4705e450-7ac4-4741-bab8-e17cb6a79050\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.865474 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-client-ca\") pod \"4705e450-7ac4-4741-bab8-e17cb6a79050\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.865489 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-config\") pod 
\"4705e450-7ac4-4741-bab8-e17cb6a79050\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.865519 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7v9s\" (UniqueName: \"kubernetes.io/projected/4705e450-7ac4-4741-bab8-e17cb6a79050-kube-api-access-p7v9s\") pod \"4705e450-7ac4-4741-bab8-e17cb6a79050\" (UID: \"4705e450-7ac4-4741-bab8-e17cb6a79050\") " Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.866696 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-config" (OuterVolumeSpecName: "config") pod "4705e450-7ac4-4741-bab8-e17cb6a79050" (UID: "4705e450-7ac4-4741-bab8-e17cb6a79050"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.867173 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-client-ca" (OuterVolumeSpecName: "client-ca") pod "4705e450-7ac4-4741-bab8-e17cb6a79050" (UID: "4705e450-7ac4-4741-bab8-e17cb6a79050"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.869869 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4705e450-7ac4-4741-bab8-e17cb6a79050-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4705e450-7ac4-4741-bab8-e17cb6a79050" (UID: "4705e450-7ac4-4741-bab8-e17cb6a79050"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.873102 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4705e450-7ac4-4741-bab8-e17cb6a79050-kube-api-access-p7v9s" (OuterVolumeSpecName: "kube-api-access-p7v9s") pod "4705e450-7ac4-4741-bab8-e17cb6a79050" (UID: "4705e450-7ac4-4741-bab8-e17cb6a79050"). InnerVolumeSpecName "kube-api-access-p7v9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.894209 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.953518 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.972881 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.972910 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4705e450-7ac4-4741-bab8-e17cb6a79050-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.972919 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7v9s\" (UniqueName: \"kubernetes.io/projected/4705e450-7ac4-4741-bab8-e17cb6a79050-kube-api-access-p7v9s\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.972928 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4705e450-7ac4-4741-bab8-e17cb6a79050-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 
15:09:59.974509 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gchlt"] Mar 10 15:09:59 crc kubenswrapper[4795]: E0310 15:09:59.974927 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4705e450-7ac4-4741-bab8-e17cb6a79050" containerName="route-controller-manager" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.974941 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4705e450-7ac4-4741-bab8-e17cb6a79050" containerName="route-controller-manager" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.975262 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4705e450-7ac4-4741-bab8-e17cb6a79050" containerName="route-controller-manager" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.976571 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:09:59 crc kubenswrapper[4795]: I0310 15:09:59.980756 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gchlt"] Mar 10 15:09:59 crc kubenswrapper[4795]: W0310 15:09:59.983137 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod20fd9386_5181_4104_8413_7191c2c79b4b.slice/crio-8aa1e6600927a9d5ddcf09e711080deaf4c57a803c4f21c9d1f6d0bd8e7fadf1 WatchSource:0}: Error finding container 8aa1e6600927a9d5ddcf09e711080deaf4c57a803c4f21c9d1f6d0bd8e7fadf1: Status 404 returned error can't find the container with id 8aa1e6600927a9d5ddcf09e711080deaf4c57a803c4f21c9d1f6d0bd8e7fadf1 Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.074070 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgszd\" (UniqueName: \"kubernetes.io/projected/88cbf782-9591-4d58-8a30-e4b145687dac-kube-api-access-dgszd\") pod \"redhat-operators-gchlt\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " 
pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.074828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-catalog-content\") pod \"redhat-operators-gchlt\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.075123 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-utilities\") pod \"redhat-operators-gchlt\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.166927 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c444b4558-xgcsw"] Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.167842 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.173341 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84"] Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.173672 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.174049 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.174263 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.174291 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.174397 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.174529 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.174616 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.176535 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgszd\" (UniqueName: \"kubernetes.io/projected/88cbf782-9591-4d58-8a30-e4b145687dac-kube-api-access-dgszd\") pod \"redhat-operators-gchlt\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.177646 4795 generic.go:334] "Generic (PLEG): container finished" podID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" containerID="6754c2c744d50fc4fc9ee9d3b8622e3f08ef9467543e8a80bd2e176f6bdcefec" exitCode=0 Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.178516 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552590-2x5qv"] Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.179325 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.179565 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-catalog-content\") pod \"redhat-operators-gchlt\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.179668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-utilities\") pod \"redhat-operators-gchlt\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.181956 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-utilities\") pod \"redhat-operators-gchlt\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.182250 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-catalog-content\") pod \"redhat-operators-gchlt\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.190699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jngq" event={"ID":"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f","Type":"ContainerDied","Data":"6754c2c744d50fc4fc9ee9d3b8622e3f08ef9467543e8a80bd2e176f6bdcefec"} Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.190738 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c444b4558-xgcsw"] Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 
15:10:00.190757 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-2x5qv"] Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.190767 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jngq" event={"ID":"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f","Type":"ContainerStarted","Data":"fc09af7ff195160ae5b7cd61ab03dff0661e1d6d0f144404a93c11393d415a42"} Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.190777 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84"] Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.190856 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-2x5qv" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.196308 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.209002 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"20fd9386-5181-4104-8413-7191c2c79b4b","Type":"ContainerStarted","Data":"8aa1e6600927a9d5ddcf09e711080deaf4c57a803c4f21c9d1f6d0bd8e7fadf1"} Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.222126 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgszd\" (UniqueName: \"kubernetes.io/projected/88cbf782-9591-4d58-8a30-e4b145687dac-kube-api-access-dgszd\") pod \"redhat-operators-gchlt\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.229161 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" 
event={"ID":"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16","Type":"ContainerStarted","Data":"c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e"} Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.229196 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" event={"ID":"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16","Type":"ContainerStarted","Data":"2563dbac85073daa3fec6c680cc18882fd6fd7f4d86394af5b2ddd43e5ed721c"} Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.229555 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.239051 4795 generic.go:334] "Generic (PLEG): container finished" podID="4705e450-7ac4-4741-bab8-e17cb6a79050" containerID="550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e" exitCode=0 Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.239211 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.239216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" event={"ID":"4705e450-7ac4-4741-bab8-e17cb6a79050","Type":"ContainerDied","Data":"550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e"} Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.239606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss" event={"ID":"4705e450-7ac4-4741-bab8-e17cb6a79050","Type":"ContainerDied","Data":"fc6fdd390c61b16ea022ce339c56c802e61d6625c612425a875dbbe323b0457b"} Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.239633 4795 scope.go:117] "RemoveContainer" containerID="550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.247463 4795 generic.go:334] "Generic (PLEG): container finished" podID="ae045732-f556-4808-bcd3-114aed4f8414" containerID="6652012da9af4ce7d3b607bd0626aeefc26184cb78b499d99aff5905554d01c4" exitCode=0 Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.248135 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fcm8" event={"ID":"ae045732-f556-4808-bcd3-114aed4f8414","Type":"ContainerDied","Data":"6652012da9af4ce7d3b607bd0626aeefc26184cb78b499d99aff5905554d01c4"} Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.248174 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fcm8" event={"ID":"ae045732-f556-4808-bcd3-114aed4f8414","Type":"ContainerStarted","Data":"d5acd2b5f7207fb31aac14d3fa256298b93479ff0d6c54300d834dc404a54d58"} Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.273155 4795 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-operators-rltg9"] Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.288602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06a6322-431a-4dab-9935-ca91ee94bc49-serving-cert\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.288752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-config\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.288780 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27lt4\" (UniqueName: \"kubernetes.io/projected/4d15efc4-4cee-459e-b10d-e0452d172fc7-kube-api-access-27lt4\") pod \"auto-csr-approver-29552590-2x5qv\" (UID: \"4d15efc4-4cee-459e-b10d-e0452d172fc7\") " pod="openshift-infra/auto-csr-approver-29552590-2x5qv" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.288950 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a15957-8d19-470a-85fe-f651ff58f8ae-serving-cert\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.289157 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8d7k8\" (UniqueName: \"kubernetes.io/projected/c06a6322-431a-4dab-9935-ca91ee94bc49-kube-api-access-8d7k8\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.289255 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-client-ca\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.289331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-client-ca\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.289418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-proxy-ca-bundles\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.289438 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w476m\" (UniqueName: \"kubernetes.io/projected/37a15957-8d19-470a-85fe-f651ff58f8ae-kube-api-access-w476m\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: 
\"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.289550 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-config\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.298194 4795 scope.go:117] "RemoveContainer" containerID="550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e" Mar 10 15:10:00 crc kubenswrapper[4795]: E0310 15:10:00.299863 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e\": container with ID starting with 550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e not found: ID does not exist" containerID="550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.299894 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e"} err="failed to get container status \"550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e\": rpc error: code = NotFound desc = could not find container \"550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e\": container with ID starting with 550d2170b085424fbe3ee52858196460cbc463651f6938be64739e3b4152127e not found: ID does not exist" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.311219 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.318780 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" podStartSLOduration=165.318763088 podStartE2EDuration="2m45.318763088s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:00.309377864 +0000 UTC m=+233.475118812" watchObservedRunningTime="2026-03-10 15:10:00.318763088 +0000 UTC m=+233.484503986" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.322678 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss"] Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.324623 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-45xss"] Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.332471 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:00 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:10:00 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:00 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.332533 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.390418 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8d7k8\" (UniqueName: \"kubernetes.io/projected/c06a6322-431a-4dab-9935-ca91ee94bc49-kube-api-access-8d7k8\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.390477 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-client-ca\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.390538 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-client-ca\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.392452 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-proxy-ca-bundles\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.392516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w476m\" (UniqueName: \"kubernetes.io/projected/37a15957-8d19-470a-85fe-f651ff58f8ae-kube-api-access-w476m\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " 
pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.392562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-config\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.392653 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06a6322-431a-4dab-9935-ca91ee94bc49-serving-cert\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.392678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-config\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.392713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27lt4\" (UniqueName: \"kubernetes.io/projected/4d15efc4-4cee-459e-b10d-e0452d172fc7-kube-api-access-27lt4\") pod \"auto-csr-approver-29552590-2x5qv\" (UID: \"4d15efc4-4cee-459e-b10d-e0452d172fc7\") " pod="openshift-infra/auto-csr-approver-29552590-2x5qv" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.392802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a15957-8d19-470a-85fe-f651ff58f8ae-serving-cert\") 
pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.392041 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-client-ca\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.392324 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-client-ca\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.394818 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-proxy-ca-bundles\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.395269 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-config\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.395912 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-config\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.397595 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a15957-8d19-470a-85fe-f651ff58f8ae-serving-cert\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.399343 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06a6322-431a-4dab-9935-ca91ee94bc49-serving-cert\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.407759 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d7k8\" (UniqueName: \"kubernetes.io/projected/c06a6322-431a-4dab-9935-ca91ee94bc49-kube-api-access-8d7k8\") pod \"route-controller-manager-8574fc688-ffw84\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.409735 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w476m\" (UniqueName: \"kubernetes.io/projected/37a15957-8d19-470a-85fe-f651ff58f8ae-kube-api-access-w476m\") pod \"controller-manager-7c444b4558-xgcsw\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 
15:10:00.409747 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27lt4\" (UniqueName: \"kubernetes.io/projected/4d15efc4-4cee-459e-b10d-e0452d172fc7-kube-api-access-27lt4\") pod \"auto-csr-approver-29552590-2x5qv\" (UID: \"4d15efc4-4cee-459e-b10d-e0452d172fc7\") " pod="openshift-infra/auto-csr-approver-29552590-2x5qv" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.503013 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.530181 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.537644 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-2x5qv" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.695636 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gchlt"] Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.717826 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.718644 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.722765 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.722908 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.725927 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 15:10:00 crc kubenswrapper[4795]: W0310 15:10:00.742176 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88cbf782_9591_4d58_8a30_e4b145687dac.slice/crio-12a8f365eaccbfd76231de0f05a94cdd1b72108c6b2e38e7ab31e29382f9cb9f WatchSource:0}: Error finding container 12a8f365eaccbfd76231de0f05a94cdd1b72108c6b2e38e7ab31e29382f9cb9f: Status 404 returned error can't find the container with id 12a8f365eaccbfd76231de0f05a94cdd1b72108c6b2e38e7ab31e29382f9cb9f Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.864743 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-bqvzg container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.864791 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bqvzg" podUID="8c484e22-84bb-402d-89ec-5251b11ae7e3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.866236 4795 patch_prober.go:28] interesting pod/downloads-7954f5f757-bqvzg 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.866261 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bqvzg" podUID="8c484e22-84bb-402d-89ec-5251b11ae7e3" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.878531 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c444b4558-xgcsw"] Mar 10 15:10:00 crc kubenswrapper[4795]: W0310 15:10:00.890866 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a15957_8d19_470a_85fe_f651ff58f8ae.slice/crio-ebd3161bb844bfcdaf0ead4f9c34a2c12c939a3fa0acb9ad7b9e2fc314676d78 WatchSource:0}: Error finding container ebd3161bb844bfcdaf0ead4f9c34a2c12c939a3fa0acb9ad7b9e2fc314676d78: Status 404 returned error can't find the container with id ebd3161bb844bfcdaf0ead4f9c34a2c12c939a3fa0acb9ad7b9e2fc314676d78 Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.900606 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc81da19-de68-4af4-8478-018528760464-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bc81da19-de68-4af4-8478-018528760464\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:10:00 crc kubenswrapper[4795]: I0310 15:10:00.900706 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc81da19-de68-4af4-8478-018528760464-kubelet-dir\") pod 
\"revision-pruner-8-crc\" (UID: \"bc81da19-de68-4af4-8478-018528760464\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.001447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc81da19-de68-4af4-8478-018528760464-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bc81da19-de68-4af4-8478-018528760464\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.001755 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc81da19-de68-4af4-8478-018528760464-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bc81da19-de68-4af4-8478-018528760464\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.001847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc81da19-de68-4af4-8478-018528760464-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"bc81da19-de68-4af4-8478-018528760464\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.019616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc81da19-de68-4af4-8478-018528760464-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"bc81da19-de68-4af4-8478-018528760464\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.051208 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.133640 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84"] Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.220256 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-2x5qv"] Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.223141 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.223182 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.228129 4795 patch_prober.go:28] interesting pod/console-f9d7485db-g452c container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.230849 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g452c" podUID="a8e49156-de09-480a-933b-6815cde0b311" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 10 15:10:01 crc kubenswrapper[4795]: W0310 15:10:01.256810 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d15efc4_4cee_459e_b10d_e0452d172fc7.slice/crio-1f4222e9fff886ed613d0e48cd1c99d579d951b98fc774cd29e8a54331042050 WatchSource:0}: Error finding container 1f4222e9fff886ed613d0e48cd1c99d579d951b98fc774cd29e8a54331042050: Status 404 returned error can't find the container with id 
1f4222e9fff886ed613d0e48cd1c99d579d951b98fc774cd29e8a54331042050 Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.259163 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" event={"ID":"c06a6322-431a-4dab-9935-ca91ee94bc49","Type":"ContainerStarted","Data":"e8c80f913dd32a1a02e91571cba97f7b03104ab8028e6522860af5a896874230"} Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.263781 4795 generic.go:334] "Generic (PLEG): container finished" podID="20fd9386-5181-4104-8413-7191c2c79b4b" containerID="32f0cf86f4bb81e49982333572d93e5379888ce6d20a3382b633f630a396f1e7" exitCode=0 Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.263849 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"20fd9386-5181-4104-8413-7191c2c79b4b","Type":"ContainerDied","Data":"32f0cf86f4bb81e49982333572d93e5379888ce6d20a3382b633f630a396f1e7"} Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.273893 4795 generic.go:334] "Generic (PLEG): container finished" podID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" containerID="547c5d64a5064f0b50d7cc216b195e7d02f1440815b57ac4804d3ed5db5c3743" exitCode=0 Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.273978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rltg9" event={"ID":"dbf76fb0-f29e-4d22-ab17-d57b93755cc6","Type":"ContainerDied","Data":"547c5d64a5064f0b50d7cc216b195e7d02f1440815b57ac4804d3ed5db5c3743"} Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.285333 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rltg9" event={"ID":"dbf76fb0-f29e-4d22-ab17-d57b93755cc6","Type":"ContainerStarted","Data":"c3bcbd32d74bd51adb26c21cbdcf0562c0521721b518f67232b5af84c5a92fd8"} Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.325593 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" event={"ID":"37a15957-8d19-470a-85fe-f651ff58f8ae","Type":"ContainerStarted","Data":"faac96f501f2f03c7e4cacc74b58e288537da56e94c2b4c7227900d5531c2a61"} Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.325633 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" event={"ID":"37a15957-8d19-470a-85fe-f651ff58f8ae","Type":"ContainerStarted","Data":"ebd3161bb844bfcdaf0ead4f9c34a2c12c939a3fa0acb9ad7b9e2fc314676d78"} Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.325792 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.330746 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.336455 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.344874 4795 generic.go:334] "Generic (PLEG): container finished" podID="88cbf782-9591-4d58-8a30-e4b145687dac" containerID="178367b6e19db31876f17fa10ae59fa3591b01ad4dd32bb542a95c7e56ad7eb2" exitCode=0 Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.345739 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gchlt" event={"ID":"88cbf782-9591-4d58-8a30-e4b145687dac","Type":"ContainerDied","Data":"178367b6e19db31876f17fa10ae59fa3591b01ad4dd32bb542a95c7e56ad7eb2"} Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.345797 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gchlt" 
event={"ID":"88cbf782-9591-4d58-8a30-e4b145687dac","Type":"ContainerStarted","Data":"12a8f365eaccbfd76231de0f05a94cdd1b72108c6b2e38e7ab31e29382f9cb9f"} Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.348465 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:01 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:10:01 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:01 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.348503 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.353654 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" podStartSLOduration=3.353636313 podStartE2EDuration="3.353636313s" podCreationTimestamp="2026-03-10 15:09:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:01.351537084 +0000 UTC m=+234.517277992" watchObservedRunningTime="2026-03-10 15:10:01.353636313 +0000 UTC m=+234.519377211" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.487612 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4705e450-7ac4-4741-bab8-e17cb6a79050" path="/var/lib/kubelet/pods/4705e450-7ac4-4741-bab8-e17cb6a79050/volumes" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.546109 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:10:01 crc kubenswrapper[4795]: I0310 15:10:01.782679 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 10 15:10:01 crc kubenswrapper[4795]: W0310 15:10:01.830872 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbc81da19_de68_4af4_8478_018528760464.slice/crio-d4d5ed866afdeddcda941012f7a25dca9ccb03202ffbb5913f4ab37dfed60d20 WatchSource:0}: Error finding container d4d5ed866afdeddcda941012f7a25dca9ccb03202ffbb5913f4ab37dfed60d20: Status 404 returned error can't find the container with id d4d5ed866afdeddcda941012f7a25dca9ccb03202ffbb5913f4ab37dfed60d20 Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.329809 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:02 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:10:02 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:02 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.329858 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.361044 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552590-2x5qv" event={"ID":"4d15efc4-4cee-459e-b10d-e0452d172fc7","Type":"ContainerStarted","Data":"1f4222e9fff886ed613d0e48cd1c99d579d951b98fc774cd29e8a54331042050"} Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.363095 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bc81da19-de68-4af4-8478-018528760464","Type":"ContainerStarted","Data":"d4d5ed866afdeddcda941012f7a25dca9ccb03202ffbb5913f4ab37dfed60d20"} Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.370935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" event={"ID":"c06a6322-431a-4dab-9935-ca91ee94bc49","Type":"ContainerStarted","Data":"08104485c95719b961b1e5836e02ccd422e49bac0640deb82d65d1ba0a4f7e61"} Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.371532 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.378780 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.410576 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" podStartSLOduration=4.410556321 podStartE2EDuration="4.410556321s" podCreationTimestamp="2026-03-10 15:09:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:02.392172443 +0000 UTC m=+235.557913341" watchObservedRunningTime="2026-03-10 15:10:02.410556321 +0000 UTC m=+235.576297219" Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.576150 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.577238 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 
15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.582804 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.692971 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.756055 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20fd9386-5181-4104-8413-7191c2c79b4b-kube-api-access\") pod \"20fd9386-5181-4104-8413-7191c2c79b4b\" (UID: \"20fd9386-5181-4104-8413-7191c2c79b4b\") " Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.756179 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20fd9386-5181-4104-8413-7191c2c79b4b-kubelet-dir\") pod \"20fd9386-5181-4104-8413-7191c2c79b4b\" (UID: \"20fd9386-5181-4104-8413-7191c2c79b4b\") " Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.756471 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20fd9386-5181-4104-8413-7191c2c79b4b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "20fd9386-5181-4104-8413-7191c2c79b4b" (UID: "20fd9386-5181-4104-8413-7191c2c79b4b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.766715 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20fd9386-5181-4104-8413-7191c2c79b4b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "20fd9386-5181-4104-8413-7191c2c79b4b" (UID: "20fd9386-5181-4104-8413-7191c2c79b4b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.857660 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20fd9386-5181-4104-8413-7191c2c79b4b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:02 crc kubenswrapper[4795]: I0310 15:10:02.857786 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20fd9386-5181-4104-8413-7191c2c79b4b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:03 crc kubenswrapper[4795]: I0310 15:10:03.331116 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:03 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:10:03 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:03 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:03 crc kubenswrapper[4795]: I0310 15:10:03.331451 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:03 crc kubenswrapper[4795]: I0310 15:10:03.377515 4795 ???:1] "http: TLS handshake error from 192.168.126.11:59342: no serving certificate available for the kubelet" Mar 10 15:10:03 crc kubenswrapper[4795]: I0310 15:10:03.389932 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"20fd9386-5181-4104-8413-7191c2c79b4b","Type":"ContainerDied","Data":"8aa1e6600927a9d5ddcf09e711080deaf4c57a803c4f21c9d1f6d0bd8e7fadf1"} Mar 10 15:10:03 crc kubenswrapper[4795]: I0310 15:10:03.389967 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 10 15:10:03 crc kubenswrapper[4795]: I0310 15:10:03.389982 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aa1e6600927a9d5ddcf09e711080deaf4c57a803c4f21c9d1f6d0bd8e7fadf1" Mar 10 15:10:03 crc kubenswrapper[4795]: I0310 15:10:03.395348 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bc81da19-de68-4af4-8478-018528760464","Type":"ContainerStarted","Data":"f08b5e6f6d51944b607b04a938c9991c425a5a67edf3a9e83f4963d7db51a992"} Mar 10 15:10:03 crc kubenswrapper[4795]: I0310 15:10:03.399479 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-cp7r2" Mar 10 15:10:03 crc kubenswrapper[4795]: I0310 15:10:03.415506 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.415490434 podStartE2EDuration="3.415490434s" podCreationTimestamp="2026-03-10 15:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:03.412290174 +0000 UTC m=+236.578031082" watchObservedRunningTime="2026-03-10 15:10:03.415490434 +0000 UTC m=+236.581231332" Mar 10 15:10:04 crc kubenswrapper[4795]: I0310 15:10:04.329262 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:04 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:10:04 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:04 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:04 crc 
kubenswrapper[4795]: I0310 15:10:04.329386 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:04 crc kubenswrapper[4795]: I0310 15:10:04.358776 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8v6fj" Mar 10 15:10:04 crc kubenswrapper[4795]: I0310 15:10:04.407368 4795 generic.go:334] "Generic (PLEG): container finished" podID="bc81da19-de68-4af4-8478-018528760464" containerID="f08b5e6f6d51944b607b04a938c9991c425a5a67edf3a9e83f4963d7db51a992" exitCode=0 Mar 10 15:10:04 crc kubenswrapper[4795]: I0310 15:10:04.407382 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bc81da19-de68-4af4-8478-018528760464","Type":"ContainerDied","Data":"f08b5e6f6d51944b607b04a938c9991c425a5a67edf3a9e83f4963d7db51a992"} Mar 10 15:10:05 crc kubenswrapper[4795]: I0310 15:10:05.331940 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:05 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:10:05 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:05 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:05 crc kubenswrapper[4795]: I0310 15:10:05.331996 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:06 crc kubenswrapper[4795]: I0310 15:10:06.162051 4795 ???:1] "http: TLS handshake error from 
192.168.126.11:59352: no serving certificate available for the kubelet" Mar 10 15:10:06 crc kubenswrapper[4795]: I0310 15:10:06.328963 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:06 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:10:06 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:06 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:06 crc kubenswrapper[4795]: I0310 15:10:06.329587 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:07 crc kubenswrapper[4795]: I0310 15:10:07.329753 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:07 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:10:07 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:07 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:07 crc kubenswrapper[4795]: I0310 15:10:07.329801 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:08 crc kubenswrapper[4795]: I0310 15:10:08.331016 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:08 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:10:08 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:08 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:08 crc kubenswrapper[4795]: I0310 15:10:08.331596 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:09 crc kubenswrapper[4795]: I0310 15:10:09.328182 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:10:09 crc kubenswrapper[4795]: I0310 15:10:09.330572 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:09 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:10:09 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:09 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:09 crc kubenswrapper[4795]: I0310 15:10:09.330614 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:09 crc kubenswrapper[4795]: I0310 15:10:09.331243 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 15:10:09 crc 
kubenswrapper[4795]: I0310 15:10:09.360827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3036349b-f184-48aa-b5ab-de9c5c7ae511-metrics-certs\") pod \"network-metrics-daemon-zjg2f\" (UID: \"3036349b-f184-48aa-b5ab-de9c5c7ae511\") " pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:10:09 crc kubenswrapper[4795]: I0310 15:10:09.400933 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 15:10:09 crc kubenswrapper[4795]: I0310 15:10:09.410254 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zjg2f" Mar 10 15:10:10 crc kubenswrapper[4795]: I0310 15:10:10.329750 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:10 crc kubenswrapper[4795]: [-]has-synced failed: reason withheld Mar 10 15:10:10 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:10 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:10 crc kubenswrapper[4795]: I0310 15:10:10.329855 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:10 crc kubenswrapper[4795]: I0310 15:10:10.902946 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-bqvzg" Mar 10 15:10:11 crc kubenswrapper[4795]: I0310 15:10:11.223084 4795 patch_prober.go:28] interesting pod/console-f9d7485db-g452c container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 10 15:10:11 crc kubenswrapper[4795]: I0310 15:10:11.223130 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-g452c" podUID="a8e49156-de09-480a-933b-6815cde0b311" containerName="console" probeResult="failure" output="Get \"https://10.217.0.27:8443/health\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 10 15:10:11 crc kubenswrapper[4795]: I0310 15:10:11.328765 4795 patch_prober.go:28] interesting pod/router-default-5444994796-xg845 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 10 15:10:11 crc kubenswrapper[4795]: [+]has-synced ok Mar 10 15:10:11 crc kubenswrapper[4795]: [+]process-running ok Mar 10 15:10:11 crc kubenswrapper[4795]: healthz check failed Mar 10 15:10:11 crc kubenswrapper[4795]: I0310 15:10:11.328836 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xg845" podUID="01f41b58-e54b-4f47-bd01-a11f9078087c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 10 15:10:12 crc kubenswrapper[4795]: I0310 15:10:12.329773 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:10:12 crc kubenswrapper[4795]: I0310 15:10:12.339029 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xg845" Mar 10 15:10:12 crc kubenswrapper[4795]: I0310 15:10:12.546566 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:10:12 crc kubenswrapper[4795]: I0310 15:10:12.713852 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc81da19-de68-4af4-8478-018528760464-kube-api-access\") pod \"bc81da19-de68-4af4-8478-018528760464\" (UID: \"bc81da19-de68-4af4-8478-018528760464\") " Mar 10 15:10:12 crc kubenswrapper[4795]: I0310 15:10:12.713905 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc81da19-de68-4af4-8478-018528760464-kubelet-dir\") pod \"bc81da19-de68-4af4-8478-018528760464\" (UID: \"bc81da19-de68-4af4-8478-018528760464\") " Mar 10 15:10:12 crc kubenswrapper[4795]: I0310 15:10:12.714179 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc81da19-de68-4af4-8478-018528760464-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bc81da19-de68-4af4-8478-018528760464" (UID: "bc81da19-de68-4af4-8478-018528760464"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4795]: I0310 15:10:12.719355 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc81da19-de68-4af4-8478-018528760464-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bc81da19-de68-4af4-8478-018528760464" (UID: "bc81da19-de68-4af4-8478-018528760464"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:12 crc kubenswrapper[4795]: I0310 15:10:12.815776 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc81da19-de68-4af4-8478-018528760464-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:12 crc kubenswrapper[4795]: I0310 15:10:12.815803 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc81da19-de68-4af4-8478-018528760464-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:13 crc kubenswrapper[4795]: I0310 15:10:13.471336 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"bc81da19-de68-4af4-8478-018528760464","Type":"ContainerDied","Data":"d4d5ed866afdeddcda941012f7a25dca9ccb03202ffbb5913f4ab37dfed60d20"} Mar 10 15:10:13 crc kubenswrapper[4795]: I0310 15:10:13.471669 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4d5ed866afdeddcda941012f7a25dca9ccb03202ffbb5913f4ab37dfed60d20" Mar 10 15:10:13 crc kubenswrapper[4795]: I0310 15:10:13.471390 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 10 15:10:13 crc kubenswrapper[4795]: I0310 15:10:13.640644 4795 ???:1] "http: TLS handshake error from 192.168.126.11:40172: no serving certificate available for the kubelet" Mar 10 15:10:16 crc kubenswrapper[4795]: I0310 15:10:16.973019 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c444b4558-xgcsw"] Mar 10 15:10:16 crc kubenswrapper[4795]: I0310 15:10:16.973896 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" podUID="37a15957-8d19-470a-85fe-f651ff58f8ae" containerName="controller-manager" containerID="cri-o://faac96f501f2f03c7e4cacc74b58e288537da56e94c2b4c7227900d5531c2a61" gracePeriod=30 Mar 10 15:10:16 crc kubenswrapper[4795]: I0310 15:10:16.976652 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84"] Mar 10 15:10:16 crc kubenswrapper[4795]: I0310 15:10:16.976905 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" podUID="c06a6322-431a-4dab-9935-ca91ee94bc49" containerName="route-controller-manager" containerID="cri-o://08104485c95719b961b1e5836e02ccd422e49bac0640deb82d65d1ba0a4f7e61" gracePeriod=30 Mar 10 15:10:18 crc kubenswrapper[4795]: I0310 15:10:18.503137 4795 generic.go:334] "Generic (PLEG): container finished" podID="37a15957-8d19-470a-85fe-f651ff58f8ae" containerID="faac96f501f2f03c7e4cacc74b58e288537da56e94c2b4c7227900d5531c2a61" exitCode=0 Mar 10 15:10:18 crc kubenswrapper[4795]: I0310 15:10:18.503181 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" 
event={"ID":"37a15957-8d19-470a-85fe-f651ff58f8ae","Type":"ContainerDied","Data":"faac96f501f2f03c7e4cacc74b58e288537da56e94c2b4c7227900d5531c2a61"} Mar 10 15:10:18 crc kubenswrapper[4795]: I0310 15:10:18.505283 4795 generic.go:334] "Generic (PLEG): container finished" podID="c06a6322-431a-4dab-9935-ca91ee94bc49" containerID="08104485c95719b961b1e5836e02ccd422e49bac0640deb82d65d1ba0a4f7e61" exitCode=0 Mar 10 15:10:18 crc kubenswrapper[4795]: I0310 15:10:18.505330 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" event={"ID":"c06a6322-431a-4dab-9935-ca91ee94bc49","Type":"ContainerDied","Data":"08104485c95719b961b1e5836e02ccd422e49bac0640deb82d65d1ba0a4f7e61"} Mar 10 15:10:18 crc kubenswrapper[4795]: I0310 15:10:18.538713 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:10:18 crc kubenswrapper[4795]: I0310 15:10:18.538797 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:10:19 crc kubenswrapper[4795]: I0310 15:10:19.232624 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" Mar 10 15:10:20 crc kubenswrapper[4795]: I0310 15:10:20.504570 4795 patch_prober.go:28] interesting pod/controller-manager-7c444b4558-xgcsw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 
10.217.0.54:8443: connect: connection refused" start-of-body= Mar 10 15:10:20 crc kubenswrapper[4795]: I0310 15:10:20.504627 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" podUID="37a15957-8d19-470a-85fe-f651ff58f8ae" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Mar 10 15:10:21 crc kubenswrapper[4795]: I0310 15:10:21.228483 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:10:21 crc kubenswrapper[4795]: I0310 15:10:21.239472 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:10:21 crc kubenswrapper[4795]: I0310 15:10:21.531709 4795 patch_prober.go:28] interesting pod/route-controller-manager-8574fc688-ffw84 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 10 15:10:21 crc kubenswrapper[4795]: I0310 15:10:21.531780 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" podUID="c06a6322-431a-4dab-9935-ca91ee94bc49" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.697190 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.727609 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7"] Mar 10 15:10:22 crc kubenswrapper[4795]: E0310 15:10:22.732836 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20fd9386-5181-4104-8413-7191c2c79b4b" containerName="pruner" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.732873 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="20fd9386-5181-4104-8413-7191c2c79b4b" containerName="pruner" Mar 10 15:10:22 crc kubenswrapper[4795]: E0310 15:10:22.732896 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc81da19-de68-4af4-8478-018528760464" containerName="pruner" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.732907 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc81da19-de68-4af4-8478-018528760464" containerName="pruner" Mar 10 15:10:22 crc kubenswrapper[4795]: E0310 15:10:22.732919 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c06a6322-431a-4dab-9935-ca91ee94bc49" containerName="route-controller-manager" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.732927 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c06a6322-431a-4dab-9935-ca91ee94bc49" containerName="route-controller-manager" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.733128 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="20fd9386-5181-4104-8413-7191c2c79b4b" containerName="pruner" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.733144 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc81da19-de68-4af4-8478-018528760464" containerName="pruner" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.733159 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c06a6322-431a-4dab-9935-ca91ee94bc49" containerName="route-controller-manager" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.733725 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.748101 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7"] Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.759048 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c06a6322-431a-4dab-9935-ca91ee94bc49-serving-cert\") pod \"c06a6322-431a-4dab-9935-ca91ee94bc49\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.759137 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-config\") pod \"c06a6322-431a-4dab-9935-ca91ee94bc49\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.759321 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-client-ca\") pod \"c06a6322-431a-4dab-9935-ca91ee94bc49\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.759386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d7k8\" (UniqueName: \"kubernetes.io/projected/c06a6322-431a-4dab-9935-ca91ee94bc49-kube-api-access-8d7k8\") pod \"c06a6322-431a-4dab-9935-ca91ee94bc49\" (UID: \"c06a6322-431a-4dab-9935-ca91ee94bc49\") " Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.759984 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-config" (OuterVolumeSpecName: "config") pod "c06a6322-431a-4dab-9935-ca91ee94bc49" (UID: "c06a6322-431a-4dab-9935-ca91ee94bc49"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.760018 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-client-ca" (OuterVolumeSpecName: "client-ca") pod "c06a6322-431a-4dab-9935-ca91ee94bc49" (UID: "c06a6322-431a-4dab-9935-ca91ee94bc49"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.760373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgpt2\" (UniqueName: \"kubernetes.io/projected/71f0ba21-adcd-4f56-828a-07c814a6ea5c-kube-api-access-wgpt2\") pod \"route-controller-manager-b45454d48-2sjs7\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.760441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-config\") pod \"route-controller-manager-b45454d48-2sjs7\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.760502 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-client-ca\") pod 
\"route-controller-manager-b45454d48-2sjs7\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.760550 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f0ba21-adcd-4f56-828a-07c814a6ea5c-serving-cert\") pod \"route-controller-manager-b45454d48-2sjs7\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.760648 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.760663 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c06a6322-431a-4dab-9935-ca91ee94bc49-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.770014 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c06a6322-431a-4dab-9935-ca91ee94bc49-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c06a6322-431a-4dab-9935-ca91ee94bc49" (UID: "c06a6322-431a-4dab-9935-ca91ee94bc49"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.770019 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c06a6322-431a-4dab-9935-ca91ee94bc49-kube-api-access-8d7k8" (OuterVolumeSpecName: "kube-api-access-8d7k8") pod "c06a6322-431a-4dab-9935-ca91ee94bc49" (UID: "c06a6322-431a-4dab-9935-ca91ee94bc49"). InnerVolumeSpecName "kube-api-access-8d7k8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.861582 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgpt2\" (UniqueName: \"kubernetes.io/projected/71f0ba21-adcd-4f56-828a-07c814a6ea5c-kube-api-access-wgpt2\") pod \"route-controller-manager-b45454d48-2sjs7\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.861638 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-config\") pod \"route-controller-manager-b45454d48-2sjs7\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.861676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-client-ca\") pod \"route-controller-manager-b45454d48-2sjs7\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.861702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f0ba21-adcd-4f56-828a-07c814a6ea5c-serving-cert\") pod \"route-controller-manager-b45454d48-2sjs7\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.861803 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c06a6322-431a-4dab-9935-ca91ee94bc49-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.861902 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d7k8\" (UniqueName: \"kubernetes.io/projected/c06a6322-431a-4dab-9935-ca91ee94bc49-kube-api-access-8d7k8\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.862915 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-client-ca\") pod \"route-controller-manager-b45454d48-2sjs7\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.863244 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-config\") pod \"route-controller-manager-b45454d48-2sjs7\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.864736 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f0ba21-adcd-4f56-828a-07c814a6ea5c-serving-cert\") pod \"route-controller-manager-b45454d48-2sjs7\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:22 crc kubenswrapper[4795]: I0310 15:10:22.877953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgpt2\" (UniqueName: \"kubernetes.io/projected/71f0ba21-adcd-4f56-828a-07c814a6ea5c-kube-api-access-wgpt2\") pod \"route-controller-manager-b45454d48-2sjs7\" (UID: 
\"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:23 crc kubenswrapper[4795]: I0310 15:10:23.062214 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:23 crc kubenswrapper[4795]: I0310 15:10:23.530969 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" event={"ID":"c06a6322-431a-4dab-9935-ca91ee94bc49","Type":"ContainerDied","Data":"e8c80f913dd32a1a02e91571cba97f7b03104ab8028e6522860af5a896874230"} Mar 10 15:10:23 crc kubenswrapper[4795]: I0310 15:10:23.531040 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84" Mar 10 15:10:23 crc kubenswrapper[4795]: I0310 15:10:23.531329 4795 scope.go:117] "RemoveContainer" containerID="08104485c95719b961b1e5836e02ccd422e49bac0640deb82d65d1ba0a4f7e61" Mar 10 15:10:23 crc kubenswrapper[4795]: I0310 15:10:23.548364 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84"] Mar 10 15:10:23 crc kubenswrapper[4795]: I0310 15:10:23.550826 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8574fc688-ffw84"] Mar 10 15:10:24 crc kubenswrapper[4795]: E0310 15:10:24.136375 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 15:10:24 crc kubenswrapper[4795]: E0310 15:10:24.136553 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6m6ts,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-47zhw_openshift-marketplace(2ead40a3-f58b-4385-9cad-7a1d04229bb3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:10:24 crc kubenswrapper[4795]: E0310 15:10:24.138274 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: 
code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-47zhw" podUID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.483786 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c06a6322-431a-4dab-9935-ca91ee94bc49" path="/var/lib/kubelet/pods/c06a6322-431a-4dab-9935-ca91ee94bc49/volumes" Mar 10 15:10:25 crc kubenswrapper[4795]: E0310 15:10:25.629442 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-47zhw" podUID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.681636 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.704128 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a15957-8d19-470a-85fe-f651ff58f8ae-serving-cert\") pod \"37a15957-8d19-470a-85fe-f651ff58f8ae\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.704169 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-config\") pod \"37a15957-8d19-470a-85fe-f651ff58f8ae\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.704201 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-proxy-ca-bundles\") pod \"37a15957-8d19-470a-85fe-f651ff58f8ae\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.704221 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w476m\" (UniqueName: \"kubernetes.io/projected/37a15957-8d19-470a-85fe-f651ff58f8ae-kube-api-access-w476m\") pod \"37a15957-8d19-470a-85fe-f651ff58f8ae\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.704285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-client-ca\") pod \"37a15957-8d19-470a-85fe-f651ff58f8ae\" (UID: \"37a15957-8d19-470a-85fe-f651ff58f8ae\") " Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.705193 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-client-ca" (OuterVolumeSpecName: "client-ca") pod "37a15957-8d19-470a-85fe-f651ff58f8ae" (UID: "37a15957-8d19-470a-85fe-f651ff58f8ae"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.705394 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "37a15957-8d19-470a-85fe-f651ff58f8ae" (UID: "37a15957-8d19-470a-85fe-f651ff58f8ae"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.705786 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-config" (OuterVolumeSpecName: "config") pod "37a15957-8d19-470a-85fe-f651ff58f8ae" (UID: "37a15957-8d19-470a-85fe-f651ff58f8ae"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.720963 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74599d5d6b-sm6b6"] Mar 10 15:10:25 crc kubenswrapper[4795]: E0310 15:10:25.721186 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37a15957-8d19-470a-85fe-f651ff58f8ae" containerName="controller-manager" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.721200 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="37a15957-8d19-470a-85fe-f651ff58f8ae" containerName="controller-manager" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.721312 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="37a15957-8d19-470a-85fe-f651ff58f8ae" containerName="controller-manager" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.721608 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74599d5d6b-sm6b6"] Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.721689 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.735988 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37a15957-8d19-470a-85fe-f651ff58f8ae-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "37a15957-8d19-470a-85fe-f651ff58f8ae" (UID: "37a15957-8d19-470a-85fe-f651ff58f8ae"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.737403 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37a15957-8d19-470a-85fe-f651ff58f8ae-kube-api-access-w476m" (OuterVolumeSpecName: "kube-api-access-w476m") pod "37a15957-8d19-470a-85fe-f651ff58f8ae" (UID: "37a15957-8d19-470a-85fe-f651ff58f8ae"). InnerVolumeSpecName "kube-api-access-w476m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.806110 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-client-ca\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.806194 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6rfg\" (UniqueName: \"kubernetes.io/projected/fc134fad-dd1b-4b75-81d8-48fd32337219-kube-api-access-b6rfg\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.806278 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-proxy-ca-bundles\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.806331 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc134fad-dd1b-4b75-81d8-48fd32337219-serving-cert\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.806399 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-config\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.806481 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37a15957-8d19-470a-85fe-f651ff58f8ae-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.806499 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.806512 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 
10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.806527 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w476m\" (UniqueName: \"kubernetes.io/projected/37a15957-8d19-470a-85fe-f651ff58f8ae-kube-api-access-w476m\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.806539 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/37a15957-8d19-470a-85fe-f651ff58f8ae-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.907141 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6rfg\" (UniqueName: \"kubernetes.io/projected/fc134fad-dd1b-4b75-81d8-48fd32337219-kube-api-access-b6rfg\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.907195 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-proxy-ca-bundles\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.907237 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc134fad-dd1b-4b75-81d8-48fd32337219-serving-cert\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.907270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-config\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.907322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-client-ca\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.908137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-client-ca\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.910163 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-proxy-ca-bundles\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.910318 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-config\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.912031 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc134fad-dd1b-4b75-81d8-48fd32337219-serving-cert\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:25 crc kubenswrapper[4795]: I0310 15:10:25.926826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6rfg\" (UniqueName: \"kubernetes.io/projected/fc134fad-dd1b-4b75-81d8-48fd32337219-kube-api-access-b6rfg\") pod \"controller-manager-74599d5d6b-sm6b6\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:26 crc kubenswrapper[4795]: I0310 15:10:26.043488 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:26 crc kubenswrapper[4795]: I0310 15:10:26.547621 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" event={"ID":"37a15957-8d19-470a-85fe-f651ff58f8ae","Type":"ContainerDied","Data":"ebd3161bb844bfcdaf0ead4f9c34a2c12c939a3fa0acb9ad7b9e2fc314676d78"} Mar 10 15:10:26 crc kubenswrapper[4795]: I0310 15:10:26.547688 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c444b4558-xgcsw" Mar 10 15:10:26 crc kubenswrapper[4795]: E0310 15:10:26.566179 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 10 15:10:26 crc kubenswrapper[4795]: E0310 15:10:26.566284 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:10:26 crc kubenswrapper[4795]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 10 15:10:26 crc kubenswrapper[4795]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9rr26,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29552588-zdmvt_openshift-infra(c868aa84-d232-4d80-bff3-d9e0aa659769): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 10 15:10:26 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 10 15:10:26 crc kubenswrapper[4795]: E0310 15:10:26.567477 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29552588-zdmvt" podUID="c868aa84-d232-4d80-bff3-d9e0aa659769" Mar 10 15:10:26 crc kubenswrapper[4795]: I0310 15:10:26.576359 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c444b4558-xgcsw"] Mar 10 15:10:26 crc kubenswrapper[4795]: I0310 15:10:26.579128 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c444b4558-xgcsw"] Mar 10 15:10:27 crc kubenswrapper[4795]: I0310 15:10:27.482896 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37a15957-8d19-470a-85fe-f651ff58f8ae" path="/var/lib/kubelet/pods/37a15957-8d19-470a-85fe-f651ff58f8ae/volumes" Mar 10 15:10:27 crc kubenswrapper[4795]: E0310 15:10:27.552985 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29552588-zdmvt" podUID="c868aa84-d232-4d80-bff3-d9e0aa659769" Mar 10 15:10:29 crc kubenswrapper[4795]: E0310 15:10:29.809833 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 10 15:10:29 crc kubenswrapper[4795]: E0310 15:10:29.810291 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8xrwm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-kldkn_openshift-marketplace(9ac91d8b-c0bc-4758-91d3-9fa275d88e02): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:10:29 crc kubenswrapper[4795]: E0310 15:10:29.813007 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-kldkn" podUID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" Mar 10 15:10:29 crc 
kubenswrapper[4795]: E0310 15:10:29.818493 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 10 15:10:29 crc kubenswrapper[4795]: E0310 15:10:29.818656 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dgszd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-gchlt_openshift-marketplace(88cbf782-9591-4d58-8a30-e4b145687dac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:10:29 crc kubenswrapper[4795]: E0310 15:10:29.820127 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gchlt" podUID="88cbf782-9591-4d58-8a30-e4b145687dac" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.861503 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gchlt" podUID="88cbf782-9591-4d58-8a30-e4b145687dac" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.861701 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-kldkn" podUID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" Mar 10 15:10:30 crc kubenswrapper[4795]: I0310 15:10:30.898422 4795 scope.go:117] "RemoveContainer" containerID="faac96f501f2f03c7e4cacc74b58e288537da56e94c2b4c7227900d5531c2a61" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.941119 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.941298 4795 kuberuntime_manager.go:1274] "Unhandled 
Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g57p7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4jngq_openshift-marketplace(af8bfe8e-d848-4c0c-954d-6eadb84fbe0f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.942459 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4jngq" podUID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.970079 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.970217 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w29bh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,
TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9fcm8_openshift-marketplace(ae045732-f556-4808-bcd3-114aed4f8414): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.971753 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9fcm8" podUID="ae045732-f556-4808-bcd3-114aed4f8414" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.975655 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.975764 4795 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 10 15:10:30 crc kubenswrapper[4795]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 10 15:10:30 crc kubenswrapper[4795]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-27lt4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29552590-2x5qv_openshift-infra(4d15efc4-4cee-459e-b10d-e0452d172fc7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 10 15:10:30 crc kubenswrapper[4795]: > logger="UnhandledError" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.976998 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29552590-2x5qv" podUID="4d15efc4-4cee-459e-b10d-e0452d172fc7" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.992259 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.992392 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k45vl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nnfqt_openshift-marketplace(33c07b9b-efc7-4610-85d2-21e44611aa32): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:10:30 crc kubenswrapper[4795]: E0310 15:10:30.993800 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nnfqt" 
podUID="33c07b9b-efc7-4610-85d2-21e44611aa32" Mar 10 15:10:31 crc kubenswrapper[4795]: E0310 15:10:31.014606 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 10 15:10:31 crc kubenswrapper[4795]: E0310 15:10:31.014770 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dmqhh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:ni
l,} start failed in pod redhat-operators-rltg9_openshift-marketplace(dbf76fb0-f29e-4d22-ab17-d57b93755cc6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 10 15:10:31 crc kubenswrapper[4795]: E0310 15:10:31.016021 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-rltg9" podUID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.330816 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74599d5d6b-sm6b6"] Mar 10 15:10:31 crc kubenswrapper[4795]: W0310 15:10:31.337248 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc134fad_dd1b_4b75_81d8_48fd32337219.slice/crio-63b6ac159f4af447d624b85dc5fc276869a683547e0cdc8522e59380a3f9d712 WatchSource:0}: Error finding container 63b6ac159f4af447d624b85dc5fc276869a683547e0cdc8522e59380a3f9d712: Status 404 returned error can't find the container with id 63b6ac159f4af447d624b85dc5fc276869a683547e0cdc8522e59380a3f9d712 Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.355462 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7"] Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.359429 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zjg2f"] Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.572940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" 
event={"ID":"fc134fad-dd1b-4b75-81d8-48fd32337219","Type":"ContainerStarted","Data":"71de4758f103e87c98a9908eba7aa40039dc19309d831061d43f92ea9ff7495a"} Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.572983 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" event={"ID":"fc134fad-dd1b-4b75-81d8-48fd32337219","Type":"ContainerStarted","Data":"63b6ac159f4af447d624b85dc5fc276869a683547e0cdc8522e59380a3f9d712"} Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.574037 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.575610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" event={"ID":"3036349b-f184-48aa-b5ab-de9c5c7ae511","Type":"ContainerStarted","Data":"8e4faa52f237a9ddd6121be58cfdffd49df0cd51ef01bc83408a6e11a17b1fc5"} Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.575732 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" event={"ID":"3036349b-f184-48aa-b5ab-de9c5c7ae511","Type":"ContainerStarted","Data":"8b4a6156e26deda7432f67f888efc129497e1c660e42062049154062519b9cec"} Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.576373 4795 patch_prober.go:28] interesting pod/controller-manager-74599d5d6b-sm6b6 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.576427 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" podUID="fc134fad-dd1b-4b75-81d8-48fd32337219" containerName="controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.578880 4795 generic.go:334] "Generic (PLEG): container finished" podID="a53e736d-c4d5-458a-940d-fd1a0719fc45" containerID="6cff2223105eab6192e19d61d2176de1e3c5d1276aec2ddb0fc3521126476250" exitCode=0 Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.579089 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhm7h" event={"ID":"a53e736d-c4d5-458a-940d-fd1a0719fc45","Type":"ContainerDied","Data":"6cff2223105eab6192e19d61d2176de1e3c5d1276aec2ddb0fc3521126476250"} Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.582023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" event={"ID":"71f0ba21-adcd-4f56-828a-07c814a6ea5c","Type":"ContainerStarted","Data":"fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc"} Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.582162 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" event={"ID":"71f0ba21-adcd-4f56-828a-07c814a6ea5c","Type":"ContainerStarted","Data":"8dde9e9361afd8ae70f446c1dc2a9235c0b202df47a2feecf6a8298d65bde70f"} Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.583113 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.586245 4795 patch_prober.go:28] interesting pod/route-controller-manager-b45454d48-2sjs7 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Mar 10 15:10:31 crc 
kubenswrapper[4795]: I0310 15:10:31.586312 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" podUID="71f0ba21-adcd-4f56-828a-07c814a6ea5c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.590327 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" podStartSLOduration=14.590305821 podStartE2EDuration="14.590305821s" podCreationTimestamp="2026-03-10 15:10:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:31.589548969 +0000 UTC m=+264.755289867" watchObservedRunningTime="2026-03-10 15:10:31.590305821 +0000 UTC m=+264.756046719" Mar 10 15:10:31 crc kubenswrapper[4795]: E0310 15:10:31.595902 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nnfqt" podUID="33c07b9b-efc7-4610-85d2-21e44611aa32" Mar 10 15:10:31 crc kubenswrapper[4795]: E0310 15:10:31.596372 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-rltg9" podUID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" Mar 10 15:10:31 crc kubenswrapper[4795]: E0310 15:10:31.596484 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29552590-2x5qv" podUID="4d15efc4-4cee-459e-b10d-e0452d172fc7" Mar 10 15:10:31 crc kubenswrapper[4795]: E0310 15:10:31.596498 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9fcm8" podUID="ae045732-f556-4808-bcd3-114aed4f8414" Mar 10 15:10:31 crc kubenswrapper[4795]: E0310 15:10:31.596618 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-4jngq" podUID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" Mar 10 15:10:31 crc kubenswrapper[4795]: I0310 15:10:31.617938 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" podStartSLOduration=15.617915332 podStartE2EDuration="15.617915332s" podCreationTimestamp="2026-03-10 15:10:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:31.605204238 +0000 UTC m=+264.770945146" watchObservedRunningTime="2026-03-10 15:10:31.617915332 +0000 UTC m=+264.783656240" Mar 10 15:10:32 crc kubenswrapper[4795]: I0310 15:10:32.315516 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-vlsbk" Mar 10 15:10:32 crc kubenswrapper[4795]: I0310 15:10:32.600758 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zjg2f" 
event={"ID":"3036349b-f184-48aa-b5ab-de9c5c7ae511","Type":"ContainerStarted","Data":"26b6163ff3dfd5c2836503384cc7fd593671a0142acf1e3e33a074c93308cb24"} Mar 10 15:10:32 crc kubenswrapper[4795]: I0310 15:10:32.604051 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhm7h" event={"ID":"a53e736d-c4d5-458a-940d-fd1a0719fc45","Type":"ContainerStarted","Data":"787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7"} Mar 10 15:10:32 crc kubenswrapper[4795]: I0310 15:10:32.610724 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:32 crc kubenswrapper[4795]: I0310 15:10:32.612882 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:32 crc kubenswrapper[4795]: I0310 15:10:32.617248 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zjg2f" podStartSLOduration=197.617233834 podStartE2EDuration="3m17.617233834s" podCreationTimestamp="2026-03-10 15:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:32.614499686 +0000 UTC m=+265.780240594" watchObservedRunningTime="2026-03-10 15:10:32.617233834 +0000 UTC m=+265.782974742" Mar 10 15:10:32 crc kubenswrapper[4795]: I0310 15:10:32.633831 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zhm7h" podStartSLOduration=3.464418905 podStartE2EDuration="36.633814559s" podCreationTimestamp="2026-03-10 15:09:56 +0000 UTC" firstStartedPulling="2026-03-10 15:09:59.140385931 +0000 UTC m=+232.306126829" lastFinishedPulling="2026-03-10 15:10:32.309781585 +0000 UTC m=+265.475522483" observedRunningTime="2026-03-10 
15:10:32.630968838 +0000 UTC m=+265.796709746" watchObservedRunningTime="2026-03-10 15:10:32.633814559 +0000 UTC m=+265.799555477" Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.321094 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.323007 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.326349 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.330698 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.364169 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.427453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3cb56b1-4c6a-4318-85c8-4bab0f726336\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.427586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3cb56b1-4c6a-4318-85c8-4bab0f726336\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.528640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3cb56b1-4c6a-4318-85c8-4bab0f726336\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.528745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3cb56b1-4c6a-4318-85c8-4bab0f726336\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.528915 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c3cb56b1-4c6a-4318-85c8-4bab0f726336\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.552405 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c3cb56b1-4c6a-4318-85c8-4bab0f726336\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.647551 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:34 crc kubenswrapper[4795]: I0310 15:10:34.950610 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 10 15:10:35 crc kubenswrapper[4795]: I0310 15:10:35.622876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c3cb56b1-4c6a-4318-85c8-4bab0f726336","Type":"ContainerStarted","Data":"21eb2f2b27680fb1f59cb747dfcd6957c4f3dd1289c430d6071a0e5c7e5f782a"} Mar 10 15:10:35 crc kubenswrapper[4795]: I0310 15:10:35.623382 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c3cb56b1-4c6a-4318-85c8-4bab0f726336","Type":"ContainerStarted","Data":"3f96ec3abd6e5fa3af9cb1562470a2e62213bdd117ff3e41f597e354799b6766"} Mar 10 15:10:35 crc kubenswrapper[4795]: I0310 15:10:35.650460 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.6504344899999999 podStartE2EDuration="1.65043449s" podCreationTimestamp="2026-03-10 15:10:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:35.641180234 +0000 UTC m=+268.806921152" watchObservedRunningTime="2026-03-10 15:10:35.65043449 +0000 UTC m=+268.816175408" Mar 10 15:10:36 crc kubenswrapper[4795]: I0310 15:10:36.633287 4795 generic.go:334] "Generic (PLEG): container finished" podID="c3cb56b1-4c6a-4318-85c8-4bab0f726336" containerID="21eb2f2b27680fb1f59cb747dfcd6957c4f3dd1289c430d6071a0e5c7e5f782a" exitCode=0 Mar 10 15:10:36 crc kubenswrapper[4795]: I0310 15:10:36.633534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"c3cb56b1-4c6a-4318-85c8-4bab0f726336","Type":"ContainerDied","Data":"21eb2f2b27680fb1f59cb747dfcd6957c4f3dd1289c430d6071a0e5c7e5f782a"} Mar 10 15:10:36 crc kubenswrapper[4795]: I0310 15:10:36.943041 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74599d5d6b-sm6b6"] Mar 10 15:10:36 crc kubenswrapper[4795]: I0310 15:10:36.943647 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" podUID="fc134fad-dd1b-4b75-81d8-48fd32337219" containerName="controller-manager" containerID="cri-o://71de4758f103e87c98a9908eba7aa40039dc19309d831061d43f92ea9ff7495a" gracePeriod=30 Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.051652 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7"] Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.051858 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" podUID="71f0ba21-adcd-4f56-828a-07c814a6ea5c" containerName="route-controller-manager" containerID="cri-o://fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc" gracePeriod=30 Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.380176 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.380480 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.410561 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.471564 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgpt2\" (UniqueName: \"kubernetes.io/projected/71f0ba21-adcd-4f56-828a-07c814a6ea5c-kube-api-access-wgpt2\") pod \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.471645 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f0ba21-adcd-4f56-828a-07c814a6ea5c-serving-cert\") pod \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.471694 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-config\") pod \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.471714 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-client-ca\") pod \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\" (UID: \"71f0ba21-adcd-4f56-828a-07c814a6ea5c\") " Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.472944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-client-ca" (OuterVolumeSpecName: "client-ca") pod "71f0ba21-adcd-4f56-828a-07c814a6ea5c" (UID: "71f0ba21-adcd-4f56-828a-07c814a6ea5c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.473000 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-config" (OuterVolumeSpecName: "config") pod "71f0ba21-adcd-4f56-828a-07c814a6ea5c" (UID: "71f0ba21-adcd-4f56-828a-07c814a6ea5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.478020 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71f0ba21-adcd-4f56-828a-07c814a6ea5c-kube-api-access-wgpt2" (OuterVolumeSpecName: "kube-api-access-wgpt2") pod "71f0ba21-adcd-4f56-828a-07c814a6ea5c" (UID: "71f0ba21-adcd-4f56-828a-07c814a6ea5c"). InnerVolumeSpecName "kube-api-access-wgpt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.478089 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71f0ba21-adcd-4f56-828a-07c814a6ea5c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "71f0ba21-adcd-4f56-828a-07c814a6ea5c" (UID: "71f0ba21-adcd-4f56-828a-07c814a6ea5c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.523270 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.573536 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgpt2\" (UniqueName: \"kubernetes.io/projected/71f0ba21-adcd-4f56-828a-07c814a6ea5c-kube-api-access-wgpt2\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.573565 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71f0ba21-adcd-4f56-828a-07c814a6ea5c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.573586 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.573600 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71f0ba21-adcd-4f56-828a-07c814a6ea5c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.640778 4795 generic.go:334] "Generic (PLEG): container finished" podID="fc134fad-dd1b-4b75-81d8-48fd32337219" containerID="71de4758f103e87c98a9908eba7aa40039dc19309d831061d43f92ea9ff7495a" exitCode=0 Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.640863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" event={"ID":"fc134fad-dd1b-4b75-81d8-48fd32337219","Type":"ContainerDied","Data":"71de4758f103e87c98a9908eba7aa40039dc19309d831061d43f92ea9ff7495a"} Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.642164 4795 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-47zhw" event={"ID":"2ead40a3-f58b-4385-9cad-7a1d04229bb3","Type":"ContainerStarted","Data":"edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12"} Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.644624 4795 generic.go:334] "Generic (PLEG): container finished" podID="71f0ba21-adcd-4f56-828a-07c814a6ea5c" containerID="fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc" exitCode=0 Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.644699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" event={"ID":"71f0ba21-adcd-4f56-828a-07c814a6ea5c","Type":"ContainerDied","Data":"fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc"} Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.644728 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" event={"ID":"71f0ba21-adcd-4f56-828a-07c814a6ea5c","Type":"ContainerDied","Data":"8dde9e9361afd8ae70f446c1dc2a9235c0b202df47a2feecf6a8298d65bde70f"} Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.644771 4795 scope.go:117] "RemoveContainer" containerID="fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.645528 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.688416 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.713776 4795 scope.go:117] "RemoveContainer" containerID="fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc" Mar 10 15:10:37 crc kubenswrapper[4795]: E0310 15:10:37.714650 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc\": container with ID starting with fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc not found: ID does not exist" containerID="fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.714695 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc"} err="failed to get container status \"fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc\": rpc error: code = NotFound desc = could not find container \"fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc\": container with ID starting with fd669562d0ad4fbfd63a88e08e3bb3e0622824b6424ad5616a2dbbd52d3657bc not found: ID does not exist" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.718848 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7"] Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.721496 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b45454d48-2sjs7"] Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.899605 
4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.903931 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.977745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kube-api-access\") pod \"c3cb56b1-4c6a-4318-85c8-4bab0f726336\" (UID: \"c3cb56b1-4c6a-4318-85c8-4bab0f726336\") " Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.978561 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc134fad-dd1b-4b75-81d8-48fd32337219-serving-cert\") pod \"fc134fad-dd1b-4b75-81d8-48fd32337219\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.978612 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6rfg\" (UniqueName: \"kubernetes.io/projected/fc134fad-dd1b-4b75-81d8-48fd32337219-kube-api-access-b6rfg\") pod \"fc134fad-dd1b-4b75-81d8-48fd32337219\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.978650 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kubelet-dir\") pod \"c3cb56b1-4c6a-4318-85c8-4bab0f726336\" (UID: \"c3cb56b1-4c6a-4318-85c8-4bab0f726336\") " Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.978704 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-proxy-ca-bundles\") pod \"fc134fad-dd1b-4b75-81d8-48fd32337219\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.978736 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c3cb56b1-4c6a-4318-85c8-4bab0f726336" (UID: "c3cb56b1-4c6a-4318-85c8-4bab0f726336"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.978772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-config\") pod \"fc134fad-dd1b-4b75-81d8-48fd32337219\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.978792 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-client-ca\") pod \"fc134fad-dd1b-4b75-81d8-48fd32337219\" (UID: \"fc134fad-dd1b-4b75-81d8-48fd32337219\") " Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.979200 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.979451 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-client-ca" (OuterVolumeSpecName: "client-ca") pod "fc134fad-dd1b-4b75-81d8-48fd32337219" (UID: "fc134fad-dd1b-4b75-81d8-48fd32337219"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.979462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fc134fad-dd1b-4b75-81d8-48fd32337219" (UID: "fc134fad-dd1b-4b75-81d8-48fd32337219"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.979495 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-config" (OuterVolumeSpecName: "config") pod "fc134fad-dd1b-4b75-81d8-48fd32337219" (UID: "fc134fad-dd1b-4b75-81d8-48fd32337219"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.982231 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc134fad-dd1b-4b75-81d8-48fd32337219-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fc134fad-dd1b-4b75-81d8-48fd32337219" (UID: "fc134fad-dd1b-4b75-81d8-48fd32337219"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.982248 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c3cb56b1-4c6a-4318-85c8-4bab0f726336" (UID: "c3cb56b1-4c6a-4318-85c8-4bab0f726336"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:37 crc kubenswrapper[4795]: I0310 15:10:37.983050 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc134fad-dd1b-4b75-81d8-48fd32337219-kube-api-access-b6rfg" (OuterVolumeSpecName: "kube-api-access-b6rfg") pod "fc134fad-dd1b-4b75-81d8-48fd32337219" (UID: "fc134fad-dd1b-4b75-81d8-48fd32337219"). InnerVolumeSpecName "kube-api-access-b6rfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.079943 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.079978 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.079989 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc134fad-dd1b-4b75-81d8-48fd32337219-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.079998 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c3cb56b1-4c6a-4318-85c8-4bab0f726336-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.080009 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6rfg\" (UniqueName: \"kubernetes.io/projected/fc134fad-dd1b-4b75-81d8-48fd32337219-kube-api-access-b6rfg\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.080017 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/fc134fad-dd1b-4b75-81d8-48fd32337219-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.187890 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-554678fb84-rznvn"] Mar 10 15:10:38 crc kubenswrapper[4795]: E0310 15:10:38.188156 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71f0ba21-adcd-4f56-828a-07c814a6ea5c" containerName="route-controller-manager" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.188174 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="71f0ba21-adcd-4f56-828a-07c814a6ea5c" containerName="route-controller-manager" Mar 10 15:10:38 crc kubenswrapper[4795]: E0310 15:10:38.188189 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc134fad-dd1b-4b75-81d8-48fd32337219" containerName="controller-manager" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.188197 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc134fad-dd1b-4b75-81d8-48fd32337219" containerName="controller-manager" Mar 10 15:10:38 crc kubenswrapper[4795]: E0310 15:10:38.188212 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3cb56b1-4c6a-4318-85c8-4bab0f726336" containerName="pruner" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.188221 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3cb56b1-4c6a-4318-85c8-4bab0f726336" containerName="pruner" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.188337 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="71f0ba21-adcd-4f56-828a-07c814a6ea5c" containerName="route-controller-manager" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.188353 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3cb56b1-4c6a-4318-85c8-4bab0f726336" containerName="pruner" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.188368 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="fc134fad-dd1b-4b75-81d8-48fd32337219" containerName="controller-manager" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.188770 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.193026 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk"] Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.193665 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.195643 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.195709 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.196481 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.198092 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-554678fb84-rznvn"] Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.198810 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.199041 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.199275 4795 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.219737 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk"] Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.282678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1ee498-a4e9-430b-8441-725d972cbe1a-serving-cert\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.282715 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-client-ca\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.282737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7978f3c7-a22d-4b05-b5e3-f70f0314f580-serving-cert\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.282769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-proxy-ca-bundles\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " 
pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.282795 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-client-ca\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.282837 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mhzh\" (UniqueName: \"kubernetes.io/projected/7978f3c7-a22d-4b05-b5e3-f70f0314f580-kube-api-access-2mhzh\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.282868 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-config\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.282885 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcxtg\" (UniqueName: \"kubernetes.io/projected/cd1ee498-a4e9-430b-8441-725d972cbe1a-kube-api-access-vcxtg\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.282912 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-config\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.384432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-config\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.384471 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcxtg\" (UniqueName: \"kubernetes.io/projected/cd1ee498-a4e9-430b-8441-725d972cbe1a-kube-api-access-vcxtg\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.384504 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-config\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.384547 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1ee498-a4e9-430b-8441-725d972cbe1a-serving-cert\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " 
pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.384564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-client-ca\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.384582 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7978f3c7-a22d-4b05-b5e3-f70f0314f580-serving-cert\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.384607 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-proxy-ca-bundles\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.384624 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-client-ca\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.384661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mhzh\" (UniqueName: 
\"kubernetes.io/projected/7978f3c7-a22d-4b05-b5e3-f70f0314f580-kube-api-access-2mhzh\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.385576 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-config\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.386137 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-config\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.386746 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-client-ca\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.387138 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-proxy-ca-bundles\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.387388 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-client-ca\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.389539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1ee498-a4e9-430b-8441-725d972cbe1a-serving-cert\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.389545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7978f3c7-a22d-4b05-b5e3-f70f0314f580-serving-cert\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.402431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcxtg\" (UniqueName: \"kubernetes.io/projected/cd1ee498-a4e9-430b-8441-725d972cbe1a-kube-api-access-vcxtg\") pod \"controller-manager-554678fb84-rznvn\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.408242 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mhzh\" (UniqueName: \"kubernetes.io/projected/7978f3c7-a22d-4b05-b5e3-f70f0314f580-kube-api-access-2mhzh\") pod \"route-controller-manager-79768c7fcd-lbsjk\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " 
pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.520021 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.530510 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.580345 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zhm7h"] Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.733097 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" containerID="edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12" exitCode=0 Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.733181 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zhw" event={"ID":"2ead40a3-f58b-4385-9cad-7a1d04229bb3","Type":"ContainerDied","Data":"edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12"} Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.736355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"c3cb56b1-4c6a-4318-85c8-4bab0f726336","Type":"ContainerDied","Data":"3f96ec3abd6e5fa3af9cb1562470a2e62213bdd117ff3e41f597e354799b6766"} Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.736391 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f96ec3abd6e5fa3af9cb1562470a2e62213bdd117ff3e41f597e354799b6766" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.736445 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.750851 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.750888 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74599d5d6b-sm6b6" event={"ID":"fc134fad-dd1b-4b75-81d8-48fd32337219","Type":"ContainerDied","Data":"63b6ac159f4af447d624b85dc5fc276869a683547e0cdc8522e59380a3f9d712"} Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.750920 4795 scope.go:117] "RemoveContainer" containerID="71de4758f103e87c98a9908eba7aa40039dc19309d831061d43f92ea9ff7495a" Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.783579 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-74599d5d6b-sm6b6"] Mar 10 15:10:38 crc kubenswrapper[4795]: I0310 15:10:38.787103 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-74599d5d6b-sm6b6"] Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.033716 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-554678fb84-rznvn"] Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.091968 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk"] Mar 10 15:10:39 crc kubenswrapper[4795]: W0310 15:10:39.101080 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7978f3c7_a22d_4b05_b5e3_f70f0314f580.slice/crio-f2d7be7b5b58b90814aed9f8603aa4ad7a806fa3d46dfa03c572b5ae50393bbb WatchSource:0}: Error finding container f2d7be7b5b58b90814aed9f8603aa4ad7a806fa3d46dfa03c572b5ae50393bbb: 
Status 404 returned error can't find the container with id f2d7be7b5b58b90814aed9f8603aa4ad7a806fa3d46dfa03c572b5ae50393bbb Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.305344 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.306104 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.308943 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.309041 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.322994 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.326911 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66d38c5d-4363-47e6-b417-89fef328eb00-kube-api-access\") pod \"installer-9-crc\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.326962 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-var-lock\") pod \"installer-9-crc\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.326982 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-kubelet-dir\") pod \"installer-9-crc\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.428158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66d38c5d-4363-47e6-b417-89fef328eb00-kube-api-access\") pod \"installer-9-crc\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.428472 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-var-lock\") pod \"installer-9-crc\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.428496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-kubelet-dir\") pod \"installer-9-crc\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.428563 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-kubelet-dir\") pod \"installer-9-crc\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.429102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-var-lock\") pod \"installer-9-crc\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " 
pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.444616 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66d38c5d-4363-47e6-b417-89fef328eb00-kube-api-access\") pod \"installer-9-crc\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.483156 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71f0ba21-adcd-4f56-828a-07c814a6ea5c" path="/var/lib/kubelet/pods/71f0ba21-adcd-4f56-828a-07c814a6ea5c/volumes" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.483809 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc134fad-dd1b-4b75-81d8-48fd32337219" path="/var/lib/kubelet/pods/fc134fad-dd1b-4b75-81d8-48fd32337219/volumes" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.620572 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.760549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zhw" event={"ID":"2ead40a3-f58b-4385-9cad-7a1d04229bb3","Type":"ContainerStarted","Data":"0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969"} Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.761833 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" event={"ID":"cd1ee498-a4e9-430b-8441-725d972cbe1a","Type":"ContainerStarted","Data":"3b1f2afc1e2b6681163fba04b7dbb29577180e4d715c17e3ee2921ecd44112f2"} Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.761867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" event={"ID":"cd1ee498-a4e9-430b-8441-725d972cbe1a","Type":"ContainerStarted","Data":"ebcc3346ded691db591109b2cfc00f086ea6f7d99dacfcb5cd090a1f1ed547d7"} Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.762235 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.764049 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zhm7h" podUID="a53e736d-c4d5-458a-940d-fd1a0719fc45" containerName="registry-server" containerID="cri-o://787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7" gracePeriod=2 Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.764302 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" 
event={"ID":"7978f3c7-a22d-4b05-b5e3-f70f0314f580","Type":"ContainerStarted","Data":"da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee"} Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.764321 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" event={"ID":"7978f3c7-a22d-4b05-b5e3-f70f0314f580","Type":"ContainerStarted","Data":"f2d7be7b5b58b90814aed9f8603aa4ad7a806fa3d46dfa03c572b5ae50393bbb"} Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.764712 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.780296 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.780477 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.783691 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-47zhw" podStartSLOduration=3.7437257649999998 podStartE2EDuration="43.783681713s" podCreationTimestamp="2026-03-10 15:09:56 +0000 UTC" firstStartedPulling="2026-03-10 15:09:59.146483183 +0000 UTC m=+232.312224081" lastFinishedPulling="2026-03-10 15:10:39.186439121 +0000 UTC m=+272.352180029" observedRunningTime="2026-03-10 15:10:39.777935129 +0000 UTC m=+272.943676037" watchObservedRunningTime="2026-03-10 15:10:39.783681713 +0000 UTC m=+272.949422611" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.793779 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" podStartSLOduration=2.793759612 podStartE2EDuration="2.793759612s" podCreationTimestamp="2026-03-10 15:10:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:39.792201597 +0000 UTC m=+272.957942495" watchObservedRunningTime="2026-03-10 15:10:39.793759612 +0000 UTC m=+272.959500510" Mar 10 15:10:39 crc kubenswrapper[4795]: I0310 15:10:39.816149 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" podStartSLOduration=3.816132113 podStartE2EDuration="3.816132113s" podCreationTimestamp="2026-03-10 15:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:39.81045127 +0000 UTC m=+272.976192168" watchObservedRunningTime="2026-03-10 15:10:39.816132113 +0000 UTC m=+272.981873011" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.071810 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 10 15:10:40 crc kubenswrapper[4795]: W0310 15:10:40.077098 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod66d38c5d_4363_47e6_b417_89fef328eb00.slice/crio-83b7b03e604a811e65658c4bf2cf4559c0c4d4db1d4708a00694dafb089edf41 WatchSource:0}: Error finding container 83b7b03e604a811e65658c4bf2cf4559c0c4d4db1d4708a00694dafb089edf41: Status 404 returned error can't find the container with id 83b7b03e604a811e65658c4bf2cf4559c0c4d4db1d4708a00694dafb089edf41 Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.159365 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.248591 4795 csr.go:261] certificate signing request csr-4nbfs is approved, waiting to be issued Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.251888 4795 csr.go:257] certificate signing request csr-4nbfs is issued Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.339783 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-catalog-content\") pod \"a53e736d-c4d5-458a-940d-fd1a0719fc45\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.339855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-utilities\") pod \"a53e736d-c4d5-458a-940d-fd1a0719fc45\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.339887 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khz2x\" (UniqueName: \"kubernetes.io/projected/a53e736d-c4d5-458a-940d-fd1a0719fc45-kube-api-access-khz2x\") pod \"a53e736d-c4d5-458a-940d-fd1a0719fc45\" (UID: \"a53e736d-c4d5-458a-940d-fd1a0719fc45\") " Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.340882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-utilities" (OuterVolumeSpecName: "utilities") pod "a53e736d-c4d5-458a-940d-fd1a0719fc45" (UID: "a53e736d-c4d5-458a-940d-fd1a0719fc45"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.354325 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a53e736d-c4d5-458a-940d-fd1a0719fc45-kube-api-access-khz2x" (OuterVolumeSpecName: "kube-api-access-khz2x") pod "a53e736d-c4d5-458a-940d-fd1a0719fc45" (UID: "a53e736d-c4d5-458a-940d-fd1a0719fc45"). InnerVolumeSpecName "kube-api-access-khz2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.415701 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a53e736d-c4d5-458a-940d-fd1a0719fc45" (UID: "a53e736d-c4d5-458a-940d-fd1a0719fc45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.440799 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.440834 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a53e736d-c4d5-458a-940d-fd1a0719fc45-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.440843 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khz2x\" (UniqueName: \"kubernetes.io/projected/a53e736d-c4d5-458a-940d-fd1a0719fc45-kube-api-access-khz2x\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.771849 4795 generic.go:334] "Generic (PLEG): container finished" podID="a53e736d-c4d5-458a-940d-fd1a0719fc45" 
containerID="787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7" exitCode=0 Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.771941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhm7h" event={"ID":"a53e736d-c4d5-458a-940d-fd1a0719fc45","Type":"ContainerDied","Data":"787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7"} Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.771981 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zhm7h" event={"ID":"a53e736d-c4d5-458a-940d-fd1a0719fc45","Type":"ContainerDied","Data":"b5026d29114e4ecbb0ac1a20cb69c0bce653a82afc55ab4d5383e1878aada4d7"} Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.772007 4795 scope.go:117] "RemoveContainer" containerID="787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.772240 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zhm7h" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.781990 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"66d38c5d-4363-47e6-b417-89fef328eb00","Type":"ContainerStarted","Data":"1f4b307f05c204595682fc9ef18a5460240896c35add3a4e969ec37aa6941396"} Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.782038 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"66d38c5d-4363-47e6-b417-89fef328eb00","Type":"ContainerStarted","Data":"83b7b03e604a811e65658c4bf2cf4559c0c4d4db1d4708a00694dafb089edf41"} Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.785347 4795 generic.go:334] "Generic (PLEG): container finished" podID="c868aa84-d232-4d80-bff3-d9e0aa659769" containerID="4a08a8498382e5fe80791676789605377d90f2720bba1f48f27e953514654dc4" exitCode=0 Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.785566 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552588-zdmvt" event={"ID":"c868aa84-d232-4d80-bff3-d9e0aa659769","Type":"ContainerDied","Data":"4a08a8498382e5fe80791676789605377d90f2720bba1f48f27e953514654dc4"} Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.798982 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.798968162 podStartE2EDuration="1.798968162s" podCreationTimestamp="2026-03-10 15:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:10:40.798162039 +0000 UTC m=+273.963902957" watchObservedRunningTime="2026-03-10 15:10:40.798968162 +0000 UTC m=+273.964709060" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.809708 4795 scope.go:117] "RemoveContainer" 
containerID="6cff2223105eab6192e19d61d2176de1e3c5d1276aec2ddb0fc3521126476250" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.827008 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zhm7h"] Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.829676 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zhm7h"] Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.836760 4795 scope.go:117] "RemoveContainer" containerID="40ccda6d4ffc222fd258fc09f5796cdc7f93b64498723c989580c2a84c17abdd" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.851095 4795 scope.go:117] "RemoveContainer" containerID="787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7" Mar 10 15:10:40 crc kubenswrapper[4795]: E0310 15:10:40.851823 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7\": container with ID starting with 787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7 not found: ID does not exist" containerID="787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.851893 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7"} err="failed to get container status \"787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7\": rpc error: code = NotFound desc = could not find container \"787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7\": container with ID starting with 787d2d30aa12d914f13e857f2da950701037dfefb219c031f9a1d0f55527e9b7 not found: ID does not exist" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.851925 4795 scope.go:117] "RemoveContainer" 
containerID="6cff2223105eab6192e19d61d2176de1e3c5d1276aec2ddb0fc3521126476250" Mar 10 15:10:40 crc kubenswrapper[4795]: E0310 15:10:40.852343 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cff2223105eab6192e19d61d2176de1e3c5d1276aec2ddb0fc3521126476250\": container with ID starting with 6cff2223105eab6192e19d61d2176de1e3c5d1276aec2ddb0fc3521126476250 not found: ID does not exist" containerID="6cff2223105eab6192e19d61d2176de1e3c5d1276aec2ddb0fc3521126476250" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.852365 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cff2223105eab6192e19d61d2176de1e3c5d1276aec2ddb0fc3521126476250"} err="failed to get container status \"6cff2223105eab6192e19d61d2176de1e3c5d1276aec2ddb0fc3521126476250\": rpc error: code = NotFound desc = could not find container \"6cff2223105eab6192e19d61d2176de1e3c5d1276aec2ddb0fc3521126476250\": container with ID starting with 6cff2223105eab6192e19d61d2176de1e3c5d1276aec2ddb0fc3521126476250 not found: ID does not exist" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.852379 4795 scope.go:117] "RemoveContainer" containerID="40ccda6d4ffc222fd258fc09f5796cdc7f93b64498723c989580c2a84c17abdd" Mar 10 15:10:40 crc kubenswrapper[4795]: E0310 15:10:40.852758 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40ccda6d4ffc222fd258fc09f5796cdc7f93b64498723c989580c2a84c17abdd\": container with ID starting with 40ccda6d4ffc222fd258fc09f5796cdc7f93b64498723c989580c2a84c17abdd not found: ID does not exist" containerID="40ccda6d4ffc222fd258fc09f5796cdc7f93b64498723c989580c2a84c17abdd" Mar 10 15:10:40 crc kubenswrapper[4795]: I0310 15:10:40.852787 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"40ccda6d4ffc222fd258fc09f5796cdc7f93b64498723c989580c2a84c17abdd"} err="failed to get container status \"40ccda6d4ffc222fd258fc09f5796cdc7f93b64498723c989580c2a84c17abdd\": rpc error: code = NotFound desc = could not find container \"40ccda6d4ffc222fd258fc09f5796cdc7f93b64498723c989580c2a84c17abdd\": container with ID starting with 40ccda6d4ffc222fd258fc09f5796cdc7f93b64498723c989580c2a84c17abdd not found: ID does not exist" Mar 10 15:10:41 crc kubenswrapper[4795]: I0310 15:10:41.252996 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-17 04:17:35.010444123 +0000 UTC Mar 10 15:10:41 crc kubenswrapper[4795]: I0310 15:10:41.253041 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7501h6m53.757406581s for next certificate rotation Mar 10 15:10:41 crc kubenswrapper[4795]: I0310 15:10:41.484114 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a53e736d-c4d5-458a-940d-fd1a0719fc45" path="/var/lib/kubelet/pods/a53e736d-c4d5-458a-940d-fd1a0719fc45/volumes" Mar 10 15:10:42 crc kubenswrapper[4795]: I0310 15:10:42.049811 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-zdmvt" Mar 10 15:10:42 crc kubenswrapper[4795]: I0310 15:10:42.165603 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rr26\" (UniqueName: \"kubernetes.io/projected/c868aa84-d232-4d80-bff3-d9e0aa659769-kube-api-access-9rr26\") pod \"c868aa84-d232-4d80-bff3-d9e0aa659769\" (UID: \"c868aa84-d232-4d80-bff3-d9e0aa659769\") " Mar 10 15:10:42 crc kubenswrapper[4795]: I0310 15:10:42.171792 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c868aa84-d232-4d80-bff3-d9e0aa659769-kube-api-access-9rr26" (OuterVolumeSpecName: "kube-api-access-9rr26") pod "c868aa84-d232-4d80-bff3-d9e0aa659769" (UID: "c868aa84-d232-4d80-bff3-d9e0aa659769"). InnerVolumeSpecName "kube-api-access-9rr26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:42 crc kubenswrapper[4795]: I0310 15:10:42.253914 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-05 17:20:14.983114452 +0000 UTC Mar 10 15:10:42 crc kubenswrapper[4795]: I0310 15:10:42.253950 4795 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6482h9m32.729167322s for next certificate rotation Mar 10 15:10:42 crc kubenswrapper[4795]: I0310 15:10:42.266702 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rr26\" (UniqueName: \"kubernetes.io/projected/c868aa84-d232-4d80-bff3-d9e0aa659769-kube-api-access-9rr26\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:42 crc kubenswrapper[4795]: I0310 15:10:42.803391 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552588-zdmvt" event={"ID":"c868aa84-d232-4d80-bff3-d9e0aa659769","Type":"ContainerDied","Data":"e7a83bbab98b9229068c3d8abc29f8bf4dbfb8f0f084077b749155c70ef8013e"} Mar 10 15:10:42 crc kubenswrapper[4795]: I0310 
15:10:42.803679 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7a83bbab98b9229068c3d8abc29f8bf4dbfb8f0f084077b749155c70ef8013e" Mar 10 15:10:42 crc kubenswrapper[4795]: I0310 15:10:42.803499 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552588-zdmvt" Mar 10 15:10:47 crc kubenswrapper[4795]: I0310 15:10:47.409283 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:10:47 crc kubenswrapper[4795]: I0310 15:10:47.409364 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:10:47 crc kubenswrapper[4795]: I0310 15:10:47.471777 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:10:47 crc kubenswrapper[4795]: I0310 15:10:47.874629 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:10:48 crc kubenswrapper[4795]: I0310 15:10:48.380133 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47zhw"] Mar 10 15:10:48 crc kubenswrapper[4795]: I0310 15:10:48.539809 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:10:48 crc kubenswrapper[4795]: I0310 15:10:48.539896 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": 
dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:10:48 crc kubenswrapper[4795]: I0310 15:10:48.539971 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:10:48 crc kubenswrapper[4795]: I0310 15:10:48.540821 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:10:48 crc kubenswrapper[4795]: I0310 15:10:48.540906 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51" gracePeriod=600 Mar 10 15:10:48 crc kubenswrapper[4795]: I0310 15:10:48.831796 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51" exitCode=0 Mar 10 15:10:48 crc kubenswrapper[4795]: I0310 15:10:48.831948 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51"} Mar 10 15:10:49 crc kubenswrapper[4795]: I0310 15:10:49.812598 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd6qf"] Mar 10 15:10:49 crc kubenswrapper[4795]: I0310 15:10:49.847607 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-kldkn" event={"ID":"9ac91d8b-c0bc-4758-91d3-9fa275d88e02","Type":"ContainerStarted","Data":"b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2"} Mar 10 15:10:49 crc kubenswrapper[4795]: I0310 15:10:49.849808 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rltg9" event={"ID":"dbf76fb0-f29e-4d22-ab17-d57b93755cc6","Type":"ContainerStarted","Data":"8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864"} Mar 10 15:10:49 crc kubenswrapper[4795]: I0310 15:10:49.852288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"5ce3669fd9d20dd9ef4d0ed6efd59f873656df37d63a73174e09871244bd7b97"} Mar 10 15:10:49 crc kubenswrapper[4795]: I0310 15:10:49.854442 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gchlt" event={"ID":"88cbf782-9591-4d58-8a30-e4b145687dac","Type":"ContainerStarted","Data":"06c430e202d5b7a2f5b14068a101502b10ae8b9e86f17bad0f541f2cb55c85d7"} Mar 10 15:10:49 crc kubenswrapper[4795]: I0310 15:10:49.856099 4795 generic.go:334] "Generic (PLEG): container finished" podID="ae045732-f556-4808-bcd3-114aed4f8414" containerID="a555d6946f5b6d247fa18f8718a1ba808410750b721dfd9534531444557f769d" exitCode=0 Mar 10 15:10:49 crc kubenswrapper[4795]: I0310 15:10:49.856168 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fcm8" event={"ID":"ae045732-f556-4808-bcd3-114aed4f8414","Type":"ContainerDied","Data":"a555d6946f5b6d247fa18f8718a1ba808410750b721dfd9534531444557f769d"} Mar 10 15:10:49 crc kubenswrapper[4795]: I0310 15:10:49.857935 4795 generic.go:334] "Generic (PLEG): container finished" podID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" 
containerID="a99d28958512fc16ca68a86909411bbb8524ad9052bcbe69b0cc2b358e59f23c" exitCode=0 Mar 10 15:10:49 crc kubenswrapper[4795]: I0310 15:10:49.858012 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jngq" event={"ID":"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f","Type":"ContainerDied","Data":"a99d28958512fc16ca68a86909411bbb8524ad9052bcbe69b0cc2b358e59f23c"} Mar 10 15:10:49 crc kubenswrapper[4795]: I0310 15:10:49.858153 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-47zhw" podUID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" containerName="registry-server" containerID="cri-o://0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969" gracePeriod=2 Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.768577 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.779324 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m6ts\" (UniqueName: \"kubernetes.io/projected/2ead40a3-f58b-4385-9cad-7a1d04229bb3-kube-api-access-6m6ts\") pod \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\" (UID: \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.779434 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-utilities\") pod \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\" (UID: \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.779468 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-catalog-content\") pod \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\" 
(UID: \"2ead40a3-f58b-4385-9cad-7a1d04229bb3\") " Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.781895 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-utilities" (OuterVolumeSpecName: "utilities") pod "2ead40a3-f58b-4385-9cad-7a1d04229bb3" (UID: "2ead40a3-f58b-4385-9cad-7a1d04229bb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.786083 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ead40a3-f58b-4385-9cad-7a1d04229bb3-kube-api-access-6m6ts" (OuterVolumeSpecName: "kube-api-access-6m6ts") pod "2ead40a3-f58b-4385-9cad-7a1d04229bb3" (UID: "2ead40a3-f58b-4385-9cad-7a1d04229bb3"). InnerVolumeSpecName "kube-api-access-6m6ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.844440 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ead40a3-f58b-4385-9cad-7a1d04229bb3" (UID: "2ead40a3-f58b-4385-9cad-7a1d04229bb3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.869505 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" containerID="0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969" exitCode=0 Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.869773 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zhw" event={"ID":"2ead40a3-f58b-4385-9cad-7a1d04229bb3","Type":"ContainerDied","Data":"0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969"} Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.869808 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-47zhw" event={"ID":"2ead40a3-f58b-4385-9cad-7a1d04229bb3","Type":"ContainerDied","Data":"9e5d6fbff173297531524291190a580facfd00e037a31cfee291ea277c01275d"} Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.869829 4795 scope.go:117] "RemoveContainer" containerID="0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.870174 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-47zhw" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.874219 4795 generic.go:334] "Generic (PLEG): container finished" podID="4d15efc4-4cee-459e-b10d-e0452d172fc7" containerID="1573f4494b45acf52f25b49054853463a851d43ab58114956372b485cc5c25a5" exitCode=0 Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.874300 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552590-2x5qv" event={"ID":"4d15efc4-4cee-459e-b10d-e0452d172fc7","Type":"ContainerDied","Data":"1573f4494b45acf52f25b49054853463a851d43ab58114956372b485cc5c25a5"} Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.876475 4795 generic.go:334] "Generic (PLEG): container finished" podID="88cbf782-9591-4d58-8a30-e4b145687dac" containerID="06c430e202d5b7a2f5b14068a101502b10ae8b9e86f17bad0f541f2cb55c85d7" exitCode=0 Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.876536 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gchlt" event={"ID":"88cbf782-9591-4d58-8a30-e4b145687dac","Type":"ContainerDied","Data":"06c430e202d5b7a2f5b14068a101502b10ae8b9e86f17bad0f541f2cb55c85d7"} Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.881420 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m6ts\" (UniqueName: \"kubernetes.io/projected/2ead40a3-f58b-4385-9cad-7a1d04229bb3-kube-api-access-6m6ts\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.881440 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.881450 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2ead40a3-f58b-4385-9cad-7a1d04229bb3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.882410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fcm8" event={"ID":"ae045732-f556-4808-bcd3-114aed4f8414","Type":"ContainerStarted","Data":"43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3"} Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.889785 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jngq" event={"ID":"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f","Type":"ContainerStarted","Data":"ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811"} Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.892549 4795 generic.go:334] "Generic (PLEG): container finished" podID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" containerID="b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2" exitCode=0 Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.892639 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkn" event={"ID":"9ac91d8b-c0bc-4758-91d3-9fa275d88e02","Type":"ContainerDied","Data":"b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2"} Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.894992 4795 scope.go:117] "RemoveContainer" containerID="edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.895377 4795 generic.go:334] "Generic (PLEG): container finished" podID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" containerID="8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864" exitCode=0 Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.895471 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rltg9" 
event={"ID":"dbf76fb0-f29e-4d22-ab17-d57b93755cc6","Type":"ContainerDied","Data":"8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864"} Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.906490 4795 generic.go:334] "Generic (PLEG): container finished" podID="33c07b9b-efc7-4610-85d2-21e44611aa32" containerID="16356812bcc89ba6a70f743a161dd3229b50d94957d6be6b8767c78dfca5daee" exitCode=0 Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.906626 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnfqt" event={"ID":"33c07b9b-efc7-4610-85d2-21e44611aa32","Type":"ContainerDied","Data":"16356812bcc89ba6a70f743a161dd3229b50d94957d6be6b8767c78dfca5daee"} Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.925434 4795 scope.go:117] "RemoveContainer" containerID="395efe7bd1f936374a7092daa0cf2f1f04d3442fb161f6e3986107cb91c0a958" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.934542 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9fcm8" podStartSLOduration=2.811811884 podStartE2EDuration="52.934520931s" podCreationTimestamp="2026-03-10 15:09:58 +0000 UTC" firstStartedPulling="2026-03-10 15:10:00.259294673 +0000 UTC m=+233.425035571" lastFinishedPulling="2026-03-10 15:10:50.38200372 +0000 UTC m=+283.547744618" observedRunningTime="2026-03-10 15:10:50.929657491 +0000 UTC m=+284.095398389" watchObservedRunningTime="2026-03-10 15:10:50.934520931 +0000 UTC m=+284.100261839" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.942196 4795 scope.go:117] "RemoveContainer" containerID="0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969" Mar 10 15:10:50 crc kubenswrapper[4795]: E0310 15:10:50.943604 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969\": container with ID starting 
with 0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969 not found: ID does not exist" containerID="0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.943637 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969"} err="failed to get container status \"0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969\": rpc error: code = NotFound desc = could not find container \"0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969\": container with ID starting with 0e4f3c41502c1f484a5e42b984a3fc0b92a3f77e40f8a96d2c07fb59e7ecd969 not found: ID does not exist" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.943663 4795 scope.go:117] "RemoveContainer" containerID="edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12" Mar 10 15:10:50 crc kubenswrapper[4795]: E0310 15:10:50.944084 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12\": container with ID starting with edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12 not found: ID does not exist" containerID="edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.944115 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12"} err="failed to get container status \"edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12\": rpc error: code = NotFound desc = could not find container \"edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12\": container with ID starting with edb84f9d290d49042fbb48701871090b658265931b7377ee07361a875e764a12 not found: ID does 
not exist" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.944130 4795 scope.go:117] "RemoveContainer" containerID="395efe7bd1f936374a7092daa0cf2f1f04d3442fb161f6e3986107cb91c0a958" Mar 10 15:10:50 crc kubenswrapper[4795]: E0310 15:10:50.946168 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"395efe7bd1f936374a7092daa0cf2f1f04d3442fb161f6e3986107cb91c0a958\": container with ID starting with 395efe7bd1f936374a7092daa0cf2f1f04d3442fb161f6e3986107cb91c0a958 not found: ID does not exist" containerID="395efe7bd1f936374a7092daa0cf2f1f04d3442fb161f6e3986107cb91c0a958" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.946197 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395efe7bd1f936374a7092daa0cf2f1f04d3442fb161f6e3986107cb91c0a958"} err="failed to get container status \"395efe7bd1f936374a7092daa0cf2f1f04d3442fb161f6e3986107cb91c0a958\": rpc error: code = NotFound desc = could not find container \"395efe7bd1f936374a7092daa0cf2f1f04d3442fb161f6e3986107cb91c0a958\": container with ID starting with 395efe7bd1f936374a7092daa0cf2f1f04d3442fb161f6e3986107cb91c0a958 not found: ID does not exist" Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.978015 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-47zhw"] Mar 10 15:10:50 crc kubenswrapper[4795]: I0310 15:10:50.982327 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-47zhw"] Mar 10 15:10:51 crc kubenswrapper[4795]: I0310 15:10:51.015271 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4jngq" podStartSLOduration=2.792496833 podStartE2EDuration="53.015251474s" podCreationTimestamp="2026-03-10 15:09:58 +0000 UTC" firstStartedPulling="2026-03-10 15:10:00.195399995 +0000 UTC m=+233.361140893" 
lastFinishedPulling="2026-03-10 15:10:50.418154636 +0000 UTC m=+283.583895534" observedRunningTime="2026-03-10 15:10:51.014887733 +0000 UTC m=+284.180628631" watchObservedRunningTime="2026-03-10 15:10:51.015251474 +0000 UTC m=+284.180992372" Mar 10 15:10:51 crc kubenswrapper[4795]: I0310 15:10:51.487140 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" path="/var/lib/kubelet/pods/2ead40a3-f58b-4385-9cad-7a1d04229bb3/volumes" Mar 10 15:10:51 crc kubenswrapper[4795]: I0310 15:10:51.916813 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnfqt" event={"ID":"33c07b9b-efc7-4610-85d2-21e44611aa32","Type":"ContainerStarted","Data":"a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d"} Mar 10 15:10:51 crc kubenswrapper[4795]: I0310 15:10:51.921134 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gchlt" event={"ID":"88cbf782-9591-4d58-8a30-e4b145687dac","Type":"ContainerStarted","Data":"ba7fc331e481c14e7385985d46157303109210194c4f5a95c3bd9a2cd9c4b8a6"} Mar 10 15:10:51 crc kubenswrapper[4795]: I0310 15:10:51.923274 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkn" event={"ID":"9ac91d8b-c0bc-4758-91d3-9fa275d88e02","Type":"ContainerStarted","Data":"d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27"} Mar 10 15:10:51 crc kubenswrapper[4795]: I0310 15:10:51.925576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rltg9" event={"ID":"dbf76fb0-f29e-4d22-ab17-d57b93755cc6","Type":"ContainerStarted","Data":"782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7"} Mar 10 15:10:51 crc kubenswrapper[4795]: I0310 15:10:51.975739 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nnfqt" 
podStartSLOduration=2.551930846 podStartE2EDuration="55.975717692s" podCreationTimestamp="2026-03-10 15:09:56 +0000 UTC" firstStartedPulling="2026-03-10 15:09:58.022392204 +0000 UTC m=+231.188133102" lastFinishedPulling="2026-03-10 15:10:51.44617905 +0000 UTC m=+284.611919948" observedRunningTime="2026-03-10 15:10:51.948600995 +0000 UTC m=+285.114341893" watchObservedRunningTime="2026-03-10 15:10:51.975717692 +0000 UTC m=+285.141458600" Mar 10 15:10:51 crc kubenswrapper[4795]: I0310 15:10:51.977237 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kldkn" podStartSLOduration=3.680754193 podStartE2EDuration="55.977231095s" podCreationTimestamp="2026-03-10 15:09:56 +0000 UTC" firstStartedPulling="2026-03-10 15:09:59.12222737 +0000 UTC m=+232.287968268" lastFinishedPulling="2026-03-10 15:10:51.418704272 +0000 UTC m=+284.584445170" observedRunningTime="2026-03-10 15:10:51.973500058 +0000 UTC m=+285.139240956" watchObservedRunningTime="2026-03-10 15:10:51.977231095 +0000 UTC m=+285.142971993" Mar 10 15:10:51 crc kubenswrapper[4795]: I0310 15:10:51.996719 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rltg9" podStartSLOduration=2.813661197 podStartE2EDuration="52.996700483s" podCreationTimestamp="2026-03-10 15:09:59 +0000 UTC" firstStartedPulling="2026-03-10 15:10:01.286114032 +0000 UTC m=+234.451854930" lastFinishedPulling="2026-03-10 15:10:51.469153318 +0000 UTC m=+284.634894216" observedRunningTime="2026-03-10 15:10:51.995758406 +0000 UTC m=+285.161499304" watchObservedRunningTime="2026-03-10 15:10:51.996700483 +0000 UTC m=+285.162441381" Mar 10 15:10:52 crc kubenswrapper[4795]: I0310 15:10:52.016331 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gchlt" podStartSLOduration=3.047235078 podStartE2EDuration="53.016313525s" podCreationTimestamp="2026-03-10 15:09:59 
+0000 UTC" firstStartedPulling="2026-03-10 15:10:01.349751344 +0000 UTC m=+234.515492242" lastFinishedPulling="2026-03-10 15:10:51.318829791 +0000 UTC m=+284.484570689" observedRunningTime="2026-03-10 15:10:52.015026008 +0000 UTC m=+285.180766906" watchObservedRunningTime="2026-03-10 15:10:52.016313525 +0000 UTC m=+285.182054423" Mar 10 15:10:52 crc kubenswrapper[4795]: I0310 15:10:52.321983 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-2x5qv" Mar 10 15:10:52 crc kubenswrapper[4795]: I0310 15:10:52.500198 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27lt4\" (UniqueName: \"kubernetes.io/projected/4d15efc4-4cee-459e-b10d-e0452d172fc7-kube-api-access-27lt4\") pod \"4d15efc4-4cee-459e-b10d-e0452d172fc7\" (UID: \"4d15efc4-4cee-459e-b10d-e0452d172fc7\") " Mar 10 15:10:52 crc kubenswrapper[4795]: I0310 15:10:52.508290 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d15efc4-4cee-459e-b10d-e0452d172fc7-kube-api-access-27lt4" (OuterVolumeSpecName: "kube-api-access-27lt4") pod "4d15efc4-4cee-459e-b10d-e0452d172fc7" (UID: "4d15efc4-4cee-459e-b10d-e0452d172fc7"). InnerVolumeSpecName "kube-api-access-27lt4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:52 crc kubenswrapper[4795]: I0310 15:10:52.601944 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27lt4\" (UniqueName: \"kubernetes.io/projected/4d15efc4-4cee-459e-b10d-e0452d172fc7-kube-api-access-27lt4\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:52 crc kubenswrapper[4795]: I0310 15:10:52.931458 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552590-2x5qv" event={"ID":"4d15efc4-4cee-459e-b10d-e0452d172fc7","Type":"ContainerDied","Data":"1f4222e9fff886ed613d0e48cd1c99d579d951b98fc774cd29e8a54331042050"} Mar 10 15:10:52 crc kubenswrapper[4795]: I0310 15:10:52.931506 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f4222e9fff886ed613d0e48cd1c99d579d951b98fc774cd29e8a54331042050" Mar 10 15:10:52 crc kubenswrapper[4795]: I0310 15:10:52.931506 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552590-2x5qv" Mar 10 15:10:56 crc kubenswrapper[4795]: I0310 15:10:56.720297 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:10:56 crc kubenswrapper[4795]: I0310 15:10:56.720641 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:10:56 crc kubenswrapper[4795]: I0310 15:10:56.783512 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:10:56 crc kubenswrapper[4795]: I0310 15:10:56.925050 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:10:56 crc kubenswrapper[4795]: I0310 15:10:56.925131 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:10:56 crc kubenswrapper[4795]: I0310 15:10:56.979322 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-554678fb84-rznvn"] Mar 10 15:10:56 crc kubenswrapper[4795]: I0310 15:10:56.979769 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" podUID="cd1ee498-a4e9-430b-8441-725d972cbe1a" containerName="controller-manager" containerID="cri-o://3b1f2afc1e2b6681163fba04b7dbb29577180e4d715c17e3ee2921ecd44112f2" gracePeriod=30 Mar 10 15:10:56 crc kubenswrapper[4795]: I0310 15:10:56.986013 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk"] Mar 10 15:10:56 crc kubenswrapper[4795]: I0310 15:10:56.986276 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" podUID="7978f3c7-a22d-4b05-b5e3-f70f0314f580" containerName="route-controller-manager" containerID="cri-o://da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee" gracePeriod=30 Mar 10 15:10:56 crc kubenswrapper[4795]: I0310 15:10:56.997201 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.032688 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.061577 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.905675 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.964436 4795 generic.go:334] "Generic (PLEG): container finished" podID="cd1ee498-a4e9-430b-8441-725d972cbe1a" containerID="3b1f2afc1e2b6681163fba04b7dbb29577180e4d715c17e3ee2921ecd44112f2" exitCode=0 Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.964534 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" event={"ID":"cd1ee498-a4e9-430b-8441-725d972cbe1a","Type":"ContainerDied","Data":"3b1f2afc1e2b6681163fba04b7dbb29577180e4d715c17e3ee2921ecd44112f2"} Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.970588 4795 generic.go:334] "Generic (PLEG): container finished" podID="7978f3c7-a22d-4b05-b5e3-f70f0314f580" containerID="da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee" exitCode=0 Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.970682 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.970753 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" event={"ID":"7978f3c7-a22d-4b05-b5e3-f70f0314f580","Type":"ContainerDied","Data":"da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee"} Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.970793 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk" event={"ID":"7978f3c7-a22d-4b05-b5e3-f70f0314f580","Type":"ContainerDied","Data":"f2d7be7b5b58b90814aed9f8603aa4ad7a806fa3d46dfa03c572b5ae50393bbb"} Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.970813 4795 scope.go:117] "RemoveContainer" containerID="da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee" Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.998772 4795 scope.go:117] "RemoveContainer" containerID="da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee" Mar 10 15:10:57 crc kubenswrapper[4795]: E0310 15:10:57.999365 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee\": container with ID starting with da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee not found: ID does not exist" containerID="da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee" Mar 10 15:10:57 crc kubenswrapper[4795]: I0310 15:10:57.999395 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee"} err="failed to get container status \"da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee\": rpc error: code = NotFound desc 
= could not find container \"da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee\": container with ID starting with da2b4616e509af2211e1f078342dc42a1dc783d19cb5056fbc59e52cd30a91ee not found: ID does not exist" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.083477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-client-ca\") pod \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.083539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-config\") pod \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.083565 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mhzh\" (UniqueName: \"kubernetes.io/projected/7978f3c7-a22d-4b05-b5e3-f70f0314f580-kube-api-access-2mhzh\") pod \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.083630 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7978f3c7-a22d-4b05-b5e3-f70f0314f580-serving-cert\") pod \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\" (UID: \"7978f3c7-a22d-4b05-b5e3-f70f0314f580\") " Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.084660 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-config" (OuterVolumeSpecName: "config") pod "7978f3c7-a22d-4b05-b5e3-f70f0314f580" (UID: "7978f3c7-a22d-4b05-b5e3-f70f0314f580"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.084681 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-client-ca" (OuterVolumeSpecName: "client-ca") pod "7978f3c7-a22d-4b05-b5e3-f70f0314f580" (UID: "7978f3c7-a22d-4b05-b5e3-f70f0314f580"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.095332 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7978f3c7-a22d-4b05-b5e3-f70f0314f580-kube-api-access-2mhzh" (OuterVolumeSpecName: "kube-api-access-2mhzh") pod "7978f3c7-a22d-4b05-b5e3-f70f0314f580" (UID: "7978f3c7-a22d-4b05-b5e3-f70f0314f580"). InnerVolumeSpecName "kube-api-access-2mhzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.095343 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7978f3c7-a22d-4b05-b5e3-f70f0314f580-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7978f3c7-a22d-4b05-b5e3-f70f0314f580" (UID: "7978f3c7-a22d-4b05-b5e3-f70f0314f580"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.142092 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.185462 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7978f3c7-a22d-4b05-b5e3-f70f0314f580-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.185495 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.185506 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7978f3c7-a22d-4b05-b5e3-f70f0314f580-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.185528 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mhzh\" (UniqueName: \"kubernetes.io/projected/7978f3c7-a22d-4b05-b5e3-f70f0314f580-kube-api-access-2mhzh\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207118 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p"] Mar 10 15:10:58 crc kubenswrapper[4795]: E0310 15:10:58.207366 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53e736d-c4d5-458a-940d-fd1a0719fc45" containerName="extract-utilities" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207378 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a53e736d-c4d5-458a-940d-fd1a0719fc45" containerName="extract-utilities" Mar 10 15:10:58 crc kubenswrapper[4795]: E0310 15:10:58.207388 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7978f3c7-a22d-4b05-b5e3-f70f0314f580" containerName="route-controller-manager" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 
15:10:58.207394 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7978f3c7-a22d-4b05-b5e3-f70f0314f580" containerName="route-controller-manager" Mar 10 15:10:58 crc kubenswrapper[4795]: E0310 15:10:58.207410 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c868aa84-d232-4d80-bff3-d9e0aa659769" containerName="oc" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207416 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c868aa84-d232-4d80-bff3-d9e0aa659769" containerName="oc" Mar 10 15:10:58 crc kubenswrapper[4795]: E0310 15:10:58.207425 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" containerName="extract-content" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207431 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" containerName="extract-content" Mar 10 15:10:58 crc kubenswrapper[4795]: E0310 15:10:58.207439 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd1ee498-a4e9-430b-8441-725d972cbe1a" containerName="controller-manager" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207445 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd1ee498-a4e9-430b-8441-725d972cbe1a" containerName="controller-manager" Mar 10 15:10:58 crc kubenswrapper[4795]: E0310 15:10:58.207450 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d15efc4-4cee-459e-b10d-e0452d172fc7" containerName="oc" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207457 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d15efc4-4cee-459e-b10d-e0452d172fc7" containerName="oc" Mar 10 15:10:58 crc kubenswrapper[4795]: E0310 15:10:58.207465 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53e736d-c4d5-458a-940d-fd1a0719fc45" containerName="extract-content" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207471 4795 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="a53e736d-c4d5-458a-940d-fd1a0719fc45" containerName="extract-content" Mar 10 15:10:58 crc kubenswrapper[4795]: E0310 15:10:58.207479 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53e736d-c4d5-458a-940d-fd1a0719fc45" containerName="registry-server" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207485 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a53e736d-c4d5-458a-940d-fd1a0719fc45" containerName="registry-server" Mar 10 15:10:58 crc kubenswrapper[4795]: E0310 15:10:58.207495 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" containerName="registry-server" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207500 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" containerName="registry-server" Mar 10 15:10:58 crc kubenswrapper[4795]: E0310 15:10:58.207512 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" containerName="extract-utilities" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207518 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" containerName="extract-utilities" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207613 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd1ee498-a4e9-430b-8441-725d972cbe1a" containerName="controller-manager" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207625 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ead40a3-f58b-4385-9cad-7a1d04229bb3" containerName="registry-server" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207633 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7978f3c7-a22d-4b05-b5e3-f70f0314f580" containerName="route-controller-manager" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207641 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4d15efc4-4cee-459e-b10d-e0452d172fc7" containerName="oc" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207648 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c868aa84-d232-4d80-bff3-d9e0aa659769" containerName="oc" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.207657 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a53e736d-c4d5-458a-940d-fd1a0719fc45" containerName="registry-server" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.208026 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.212109 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr"] Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.212795 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.216467 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p"] Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.232677 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr"] Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-proxy-ca-bundles\") pod \"cd1ee498-a4e9-430b-8441-725d972cbe1a\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286103 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-client-ca\") pod \"cd1ee498-a4e9-430b-8441-725d972cbe1a\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286534 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-config\") pod \"cd1ee498-a4e9-430b-8441-725d972cbe1a\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286558 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcxtg\" (UniqueName: \"kubernetes.io/projected/cd1ee498-a4e9-430b-8441-725d972cbe1a-kube-api-access-vcxtg\") pod \"cd1ee498-a4e9-430b-8441-725d972cbe1a\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286580 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd1ee498-a4e9-430b-8441-725d972cbe1a-serving-cert\") pod \"cd1ee498-a4e9-430b-8441-725d972cbe1a\" (UID: \"cd1ee498-a4e9-430b-8441-725d972cbe1a\") " Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286664 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-client-ca\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286690 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-config\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286753 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgk76\" (UniqueName: \"kubernetes.io/projected/9cdf041e-d9fa-42ac-875f-c54406e67281-kube-api-access-sgk76\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286766 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-client-ca" (OuterVolumeSpecName: "client-ca") pod "cd1ee498-a4e9-430b-8441-725d972cbe1a" (UID: "cd1ee498-a4e9-430b-8441-725d972cbe1a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286776 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cdf041e-d9fa-42ac-875f-c54406e67281-serving-cert\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286804 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cd1ee498-a4e9-430b-8441-725d972cbe1a" (UID: "cd1ee498-a4e9-430b-8441-725d972cbe1a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286830 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f883be8f-2258-47fe-b609-768f8b0a50c7-serving-cert\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286852 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-config\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286875 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tkhmh\" (UniqueName: \"kubernetes.io/projected/f883be8f-2258-47fe-b609-768f8b0a50c7-kube-api-access-tkhmh\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286954 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-proxy-ca-bundles\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.286976 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-client-ca\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.287005 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.287016 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.287310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-config" (OuterVolumeSpecName: "config") pod "cd1ee498-a4e9-430b-8441-725d972cbe1a" 
(UID: "cd1ee498-a4e9-430b-8441-725d972cbe1a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.290528 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd1ee498-a4e9-430b-8441-725d972cbe1a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cd1ee498-a4e9-430b-8441-725d972cbe1a" (UID: "cd1ee498-a4e9-430b-8441-725d972cbe1a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.290556 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd1ee498-a4e9-430b-8441-725d972cbe1a-kube-api-access-vcxtg" (OuterVolumeSpecName: "kube-api-access-vcxtg") pod "cd1ee498-a4e9-430b-8441-725d972cbe1a" (UID: "cd1ee498-a4e9-430b-8441-725d972cbe1a"). InnerVolumeSpecName "kube-api-access-vcxtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.308445 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk"] Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.318621 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-79768c7fcd-lbsjk"] Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.387675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-client-ca\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.387716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-config\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.387778 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgk76\" (UniqueName: \"kubernetes.io/projected/9cdf041e-d9fa-42ac-875f-c54406e67281-kube-api-access-sgk76\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.387798 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cdf041e-d9fa-42ac-875f-c54406e67281-serving-cert\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.387814 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f883be8f-2258-47fe-b609-768f8b0a50c7-serving-cert\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.387829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-config\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc 
kubenswrapper[4795]: I0310 15:10:58.387847 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkhmh\" (UniqueName: \"kubernetes.io/projected/f883be8f-2258-47fe-b609-768f8b0a50c7-kube-api-access-tkhmh\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.387895 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-proxy-ca-bundles\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.387914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-client-ca\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.387945 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd1ee498-a4e9-430b-8441-725d972cbe1a-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.387956 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcxtg\" (UniqueName: \"kubernetes.io/projected/cd1ee498-a4e9-430b-8441-725d972cbe1a-kube-api-access-vcxtg\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.387966 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cd1ee498-a4e9-430b-8441-725d972cbe1a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.388719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-client-ca\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.388738 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-client-ca\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.389514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-config\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.390265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-config\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.390726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-proxy-ca-bundles\") 
pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.391153 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f883be8f-2258-47fe-b609-768f8b0a50c7-serving-cert\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.393118 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cdf041e-d9fa-42ac-875f-c54406e67281-serving-cert\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.412706 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgk76\" (UniqueName: \"kubernetes.io/projected/9cdf041e-d9fa-42ac-875f-c54406e67281-kube-api-access-sgk76\") pod \"controller-manager-6cf8c4b57b-t9j9p\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") " pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.415169 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkhmh\" (UniqueName: \"kubernetes.io/projected/f883be8f-2258-47fe-b609-768f8b0a50c7-kube-api-access-tkhmh\") pod \"route-controller-manager-9bd6b545c-stvtr\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") " pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.534531 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.542591 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.907846 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9fcm8" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.907958 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9fcm8" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.947972 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr"] Mar 10 15:10:58 crc kubenswrapper[4795]: W0310 15:10:58.961104 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf883be8f_2258_47fe_b609_768f8b0a50c7.slice/crio-34f056ae47097e565de221a5dd783d1c5744dff46aefce88a234caf06ea93387 WatchSource:0}: Error finding container 34f056ae47097e565de221a5dd783d1c5744dff46aefce88a234caf06ea93387: Status 404 returned error can't find the container with id 34f056ae47097e565de221a5dd783d1c5744dff46aefce88a234caf06ea93387 Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.963868 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9fcm8" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.983225 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" event={"ID":"cd1ee498-a4e9-430b-8441-725d972cbe1a","Type":"ContainerDied","Data":"ebcc3346ded691db591109b2cfc00f086ea6f7d99dacfcb5cd090a1f1ed547d7"} Mar 10 15:10:58 crc 
kubenswrapper[4795]: I0310 15:10:58.983283 4795 scope.go:117] "RemoveContainer" containerID="3b1f2afc1e2b6681163fba04b7dbb29577180e4d715c17e3ee2921ecd44112f2" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.983501 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-554678fb84-rznvn" Mar 10 15:10:58 crc kubenswrapper[4795]: I0310 15:10:58.999775 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p"] Mar 10 15:10:59 crc kubenswrapper[4795]: W0310 15:10:59.011613 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cdf041e_d9fa_42ac_875f_c54406e67281.slice/crio-0ef2babcd778479215f80f60b6fe50e3fa736d2c3baa09bd433c41399a82db29 WatchSource:0}: Error finding container 0ef2babcd778479215f80f60b6fe50e3fa736d2c3baa09bd433c41399a82db29: Status 404 returned error can't find the container with id 0ef2babcd778479215f80f60b6fe50e3fa736d2c3baa09bd433c41399a82db29 Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.023280 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-554678fb84-rznvn"] Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.027837 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-554678fb84-rznvn"] Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.032041 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9fcm8" Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.326333 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4jngq" Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.326689 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-4jngq" Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.391040 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4jngq" Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.485660 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7978f3c7-a22d-4b05-b5e3-f70f0314f580" path="/var/lib/kubelet/pods/7978f3c7-a22d-4b05-b5e3-f70f0314f580/volumes" Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.486368 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd1ee498-a4e9-430b-8441-725d972cbe1a" path="/var/lib/kubelet/pods/cd1ee498-a4e9-430b-8441-725d972cbe1a/volumes" Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.895172 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.895230 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.948814 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.991303 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" event={"ID":"f883be8f-2258-47fe-b609-768f8b0a50c7","Type":"ContainerStarted","Data":"e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2"} Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.991349 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" 
event={"ID":"f883be8f-2258-47fe-b609-768f8b0a50c7","Type":"ContainerStarted","Data":"34f056ae47097e565de221a5dd783d1c5744dff46aefce88a234caf06ea93387"} Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.991595 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.993391 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" event={"ID":"9cdf041e-d9fa-42ac-875f-c54406e67281","Type":"ContainerStarted","Data":"9fa95e8f9014e4a6bb7883e99a8c76601ddf618e3ebfb0b5ea31aa8b36481d59"} Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.993422 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" event={"ID":"9cdf041e-d9fa-42ac-875f-c54406e67281","Type":"ContainerStarted","Data":"0ef2babcd778479215f80f60b6fe50e3fa736d2c3baa09bd433c41399a82db29"} Mar 10 15:10:59 crc kubenswrapper[4795]: I0310 15:10:59.997577 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:11:00 crc kubenswrapper[4795]: I0310 15:11:00.010400 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" podStartSLOduration=3.010385127 podStartE2EDuration="3.010385127s" podCreationTimestamp="2026-03-10 15:10:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:11:00.00770512 +0000 UTC m=+293.173446018" watchObservedRunningTime="2026-03-10 15:11:00.010385127 +0000 UTC m=+293.176126025" Mar 10 15:11:00 crc kubenswrapper[4795]: I0310 15:11:00.029801 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-4jngq" Mar 10 15:11:00 crc kubenswrapper[4795]: I0310 15:11:00.041988 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:11:00 crc kubenswrapper[4795]: I0310 15:11:00.047369 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" podStartSLOduration=4.047348296 podStartE2EDuration="4.047348296s" podCreationTimestamp="2026-03-10 15:10:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:11:00.046974485 +0000 UTC m=+293.212715383" watchObservedRunningTime="2026-03-10 15:11:00.047348296 +0000 UTC m=+293.213089204" Mar 10 15:11:00 crc kubenswrapper[4795]: I0310 15:11:00.311370 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:11:00 crc kubenswrapper[4795]: I0310 15:11:00.311778 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:11:00 crc kubenswrapper[4795]: I0310 15:11:00.352140 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:11:01 crc kubenswrapper[4795]: I0310 15:11:01.000615 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:11:01 crc kubenswrapper[4795]: I0310 15:11:01.006894 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:11:01 crc kubenswrapper[4795]: I0310 15:11:01.064572 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:11:02 crc kubenswrapper[4795]: I0310 15:11:02.382025 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jngq"] Mar 10 15:11:02 crc kubenswrapper[4795]: I0310 15:11:02.382292 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4jngq" podUID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" containerName="registry-server" containerID="cri-o://ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811" gracePeriod=2 Mar 10 15:11:02 crc kubenswrapper[4795]: I0310 15:11:02.840550 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jngq" Mar 10 15:11:02 crc kubenswrapper[4795]: I0310 15:11:02.943964 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-catalog-content\") pod \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " Mar 10 15:11:02 crc kubenswrapper[4795]: I0310 15:11:02.944027 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-utilities\") pod \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " Mar 10 15:11:02 crc kubenswrapper[4795]: I0310 15:11:02.944093 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g57p7\" (UniqueName: \"kubernetes.io/projected/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-kube-api-access-g57p7\") pod \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\" (UID: \"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f\") " Mar 10 15:11:02 crc kubenswrapper[4795]: I0310 15:11:02.944900 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-utilities" (OuterVolumeSpecName: "utilities") pod "af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" (UID: "af8bfe8e-d848-4c0c-954d-6eadb84fbe0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:11:02 crc kubenswrapper[4795]: I0310 15:11:02.945382 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:02 crc kubenswrapper[4795]: I0310 15:11:02.951357 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-kube-api-access-g57p7" (OuterVolumeSpecName: "kube-api-access-g57p7") pod "af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" (UID: "af8bfe8e-d848-4c0c-954d-6eadb84fbe0f"). InnerVolumeSpecName "kube-api-access-g57p7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:11:02 crc kubenswrapper[4795]: I0310 15:11:02.993236 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" (UID: "af8bfe8e-d848-4c0c-954d-6eadb84fbe0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.017035 4795 generic.go:334] "Generic (PLEG): container finished" podID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" containerID="ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811" exitCode=0 Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.017125 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jngq" event={"ID":"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f","Type":"ContainerDied","Data":"ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811"} Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.017152 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4jngq" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.017182 4795 scope.go:117] "RemoveContainer" containerID="ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.017167 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4jngq" event={"ID":"af8bfe8e-d848-4c0c-954d-6eadb84fbe0f","Type":"ContainerDied","Data":"fc09af7ff195160ae5b7cd61ab03dff0661e1d6d0f144404a93c11393d415a42"} Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.052634 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.052895 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g57p7\" (UniqueName: \"kubernetes.io/projected/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f-kube-api-access-g57p7\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.055850 4795 scope.go:117] "RemoveContainer" 
containerID="a99d28958512fc16ca68a86909411bbb8524ad9052bcbe69b0cc2b358e59f23c" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.064438 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jngq"] Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.070211 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4jngq"] Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.077323 4795 scope.go:117] "RemoveContainer" containerID="6754c2c744d50fc4fc9ee9d3b8622e3f08ef9467543e8a80bd2e176f6bdcefec" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.102977 4795 scope.go:117] "RemoveContainer" containerID="ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811" Mar 10 15:11:03 crc kubenswrapper[4795]: E0310 15:11:03.103437 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811\": container with ID starting with ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811 not found: ID does not exist" containerID="ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.103507 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811"} err="failed to get container status \"ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811\": rpc error: code = NotFound desc = could not find container \"ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811\": container with ID starting with ccb404536c97caed1796492ba866cbb905c325e218535ea97e59338a722b2811 not found: ID does not exist" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.103535 4795 scope.go:117] "RemoveContainer" 
containerID="a99d28958512fc16ca68a86909411bbb8524ad9052bcbe69b0cc2b358e59f23c" Mar 10 15:11:03 crc kubenswrapper[4795]: E0310 15:11:03.103885 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a99d28958512fc16ca68a86909411bbb8524ad9052bcbe69b0cc2b358e59f23c\": container with ID starting with a99d28958512fc16ca68a86909411bbb8524ad9052bcbe69b0cc2b358e59f23c not found: ID does not exist" containerID="a99d28958512fc16ca68a86909411bbb8524ad9052bcbe69b0cc2b358e59f23c" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.103942 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a99d28958512fc16ca68a86909411bbb8524ad9052bcbe69b0cc2b358e59f23c"} err="failed to get container status \"a99d28958512fc16ca68a86909411bbb8524ad9052bcbe69b0cc2b358e59f23c\": rpc error: code = NotFound desc = could not find container \"a99d28958512fc16ca68a86909411bbb8524ad9052bcbe69b0cc2b358e59f23c\": container with ID starting with a99d28958512fc16ca68a86909411bbb8524ad9052bcbe69b0cc2b358e59f23c not found: ID does not exist" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.103974 4795 scope.go:117] "RemoveContainer" containerID="6754c2c744d50fc4fc9ee9d3b8622e3f08ef9467543e8a80bd2e176f6bdcefec" Mar 10 15:11:03 crc kubenswrapper[4795]: E0310 15:11:03.104336 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6754c2c744d50fc4fc9ee9d3b8622e3f08ef9467543e8a80bd2e176f6bdcefec\": container with ID starting with 6754c2c744d50fc4fc9ee9d3b8622e3f08ef9467543e8a80bd2e176f6bdcefec not found: ID does not exist" containerID="6754c2c744d50fc4fc9ee9d3b8622e3f08ef9467543e8a80bd2e176f6bdcefec" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.104361 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6754c2c744d50fc4fc9ee9d3b8622e3f08ef9467543e8a80bd2e176f6bdcefec"} err="failed to get container status \"6754c2c744d50fc4fc9ee9d3b8622e3f08ef9467543e8a80bd2e176f6bdcefec\": rpc error: code = NotFound desc = could not find container \"6754c2c744d50fc4fc9ee9d3b8622e3f08ef9467543e8a80bd2e176f6bdcefec\": container with ID starting with 6754c2c744d50fc4fc9ee9d3b8622e3f08ef9467543e8a80bd2e176f6bdcefec not found: ID does not exist" Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.379408 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gchlt"] Mar 10 15:11:03 crc kubenswrapper[4795]: I0310 15:11:03.481854 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" path="/var/lib/kubelet/pods/af8bfe8e-d848-4c0c-954d-6eadb84fbe0f/volumes" Mar 10 15:11:04 crc kubenswrapper[4795]: I0310 15:11:04.022336 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gchlt" podUID="88cbf782-9591-4d58-8a30-e4b145687dac" containerName="registry-server" containerID="cri-o://ba7fc331e481c14e7385985d46157303109210194c4f5a95c3bd9a2cd9c4b8a6" gracePeriod=2 Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.028131 4795 generic.go:334] "Generic (PLEG): container finished" podID="88cbf782-9591-4d58-8a30-e4b145687dac" containerID="ba7fc331e481c14e7385985d46157303109210194c4f5a95c3bd9a2cd9c4b8a6" exitCode=0 Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.028194 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gchlt" event={"ID":"88cbf782-9591-4d58-8a30-e4b145687dac","Type":"ContainerDied","Data":"ba7fc331e481c14e7385985d46157303109210194c4f5a95c3bd9a2cd9c4b8a6"} Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.336874 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.386459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-catalog-content\") pod \"88cbf782-9591-4d58-8a30-e4b145687dac\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.386583 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-utilities\") pod \"88cbf782-9591-4d58-8a30-e4b145687dac\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.386652 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgszd\" (UniqueName: \"kubernetes.io/projected/88cbf782-9591-4d58-8a30-e4b145687dac-kube-api-access-dgszd\") pod \"88cbf782-9591-4d58-8a30-e4b145687dac\" (UID: \"88cbf782-9591-4d58-8a30-e4b145687dac\") " Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.387530 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-utilities" (OuterVolumeSpecName: "utilities") pod "88cbf782-9591-4d58-8a30-e4b145687dac" (UID: "88cbf782-9591-4d58-8a30-e4b145687dac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.390883 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88cbf782-9591-4d58-8a30-e4b145687dac-kube-api-access-dgszd" (OuterVolumeSpecName: "kube-api-access-dgszd") pod "88cbf782-9591-4d58-8a30-e4b145687dac" (UID: "88cbf782-9591-4d58-8a30-e4b145687dac"). InnerVolumeSpecName "kube-api-access-dgszd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.487984 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.488013 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgszd\" (UniqueName: \"kubernetes.io/projected/88cbf782-9591-4d58-8a30-e4b145687dac-kube-api-access-dgszd\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.517132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "88cbf782-9591-4d58-8a30-e4b145687dac" (UID: "88cbf782-9591-4d58-8a30-e4b145687dac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:11:05 crc kubenswrapper[4795]: I0310 15:11:05.588997 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/88cbf782-9591-4d58-8a30-e4b145687dac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:06 crc kubenswrapper[4795]: I0310 15:11:06.037190 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gchlt" event={"ID":"88cbf782-9591-4d58-8a30-e4b145687dac","Type":"ContainerDied","Data":"12a8f365eaccbfd76231de0f05a94cdd1b72108c6b2e38e7ab31e29382f9cb9f"} Mar 10 15:11:06 crc kubenswrapper[4795]: I0310 15:11:06.037263 4795 scope.go:117] "RemoveContainer" containerID="ba7fc331e481c14e7385985d46157303109210194c4f5a95c3bd9a2cd9c4b8a6" Mar 10 15:11:06 crc kubenswrapper[4795]: I0310 15:11:06.037331 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gchlt" Mar 10 15:11:06 crc kubenswrapper[4795]: I0310 15:11:06.064187 4795 scope.go:117] "RemoveContainer" containerID="06c430e202d5b7a2f5b14068a101502b10ae8b9e86f17bad0f541f2cb55c85d7" Mar 10 15:11:06 crc kubenswrapper[4795]: I0310 15:11:06.081589 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gchlt"] Mar 10 15:11:06 crc kubenswrapper[4795]: I0310 15:11:06.084633 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gchlt"] Mar 10 15:11:06 crc kubenswrapper[4795]: I0310 15:11:06.102225 4795 scope.go:117] "RemoveContainer" containerID="178367b6e19db31876f17fa10ae59fa3591b01ad4dd32bb542a95c7e56ad7eb2" Mar 10 15:11:07 crc kubenswrapper[4795]: I0310 15:11:07.489030 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88cbf782-9591-4d58-8a30-e4b145687dac" path="/var/lib/kubelet/pods/88cbf782-9591-4d58-8a30-e4b145687dac/volumes" Mar 10 15:11:14 crc kubenswrapper[4795]: I0310 15:11:14.834341 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" podUID="194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" containerName="oauth-openshift" containerID="cri-o://496692d33a551c0861b47dca7a221cd22241563427cc3b4d7b89f88fe3a358d4" gracePeriod=15 Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.108762 4795 generic.go:334] "Generic (PLEG): container finished" podID="194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" containerID="496692d33a551c0861b47dca7a221cd22241563427cc3b4d7b89f88fe3a358d4" exitCode=0 Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.108931 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" event={"ID":"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb","Type":"ContainerDied","Data":"496692d33a551c0861b47dca7a221cd22241563427cc3b4d7b89f88fe3a358d4"} Mar 10 
15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.378871 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428076 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-service-ca\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428119 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-provider-selection\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428152 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-error\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428180 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-login\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428212 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-policies\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428233 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-idp-0-file-data\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428254 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-trusted-ca-bundle\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428283 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-ocp-branding-template\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428311 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-router-certs\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428332 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts78r\" (UniqueName: 
\"kubernetes.io/projected/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-kube-api-access-ts78r\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428369 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-session\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428400 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-dir\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428415 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-cliconfig\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428433 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-serving-cert\") pod \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\" (UID: \"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb\") " Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.428680 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: 
"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.429010 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.429032 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.429136 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.429548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.448832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.449098 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.449386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.449448 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-kube-api-access-ts78r" (OuterVolumeSpecName: "kube-api-access-ts78r") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "kube-api-access-ts78r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.449797 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.450174 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.450472 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.451382 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.451442 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" (UID: "194b0dd6-bbc0-4b57-b94b-89e4e46d87fb"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529489 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529619 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529641 4795 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529653 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529664 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529674 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529684 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529695 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts78r\" (UniqueName: \"kubernetes.io/projected/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-kube-api-access-ts78r\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529704 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529714 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529721 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529731 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529739 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:15 crc kubenswrapper[4795]: I0310 15:11:15.529753 4795 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.118242 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" event={"ID":"194b0dd6-bbc0-4b57-b94b-89e4e46d87fb","Type":"ContainerDied","Data":"5a9adaadccc1875c14f83e09f63c16fed8449734521831b0edb92b3fa2ea4cd3"} Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.118324 4795 scope.go:117] "RemoveContainer" containerID="496692d33a551c0861b47dca7a221cd22241563427cc3b4d7b89f88fe3a358d4" Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.118489 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-jd6qf" Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.148567 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd6qf"] Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.157696 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-jd6qf"] Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.224242 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-78c4665959-52m7f"] Mar 10 15:11:16 crc kubenswrapper[4795]: E0310 15:11:16.224530 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cbf782-9591-4d58-8a30-e4b145687dac" containerName="registry-server" Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.224558 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cbf782-9591-4d58-8a30-e4b145687dac" containerName="registry-server" Mar 10 15:11:16 crc kubenswrapper[4795]: E0310 15:11:16.224577 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cbf782-9591-4d58-8a30-e4b145687dac" containerName="extract-content" Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.224587 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cbf782-9591-4d58-8a30-e4b145687dac" containerName="extract-content" Mar 10 15:11:16 crc kubenswrapper[4795]: E0310 15:11:16.224603 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" containerName="oauth-openshift" Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.224613 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" containerName="oauth-openshift" Mar 10 15:11:16 crc kubenswrapper[4795]: E0310 15:11:16.224630 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" 
containerName="extract-content"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.224642 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" containerName="extract-content"
Mar 10 15:11:16 crc kubenswrapper[4795]: E0310 15:11:16.224658 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" containerName="extract-utilities"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.224668 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" containerName="extract-utilities"
Mar 10 15:11:16 crc kubenswrapper[4795]: E0310 15:11:16.224684 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88cbf782-9591-4d58-8a30-e4b145687dac" containerName="extract-utilities"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.224694 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="88cbf782-9591-4d58-8a30-e4b145687dac" containerName="extract-utilities"
Mar 10 15:11:16 crc kubenswrapper[4795]: E0310 15:11:16.224705 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" containerName="registry-server"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.224715 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" containerName="registry-server"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.224868 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="af8bfe8e-d848-4c0c-954d-6eadb84fbe0f" containerName="registry-server"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.224885 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" containerName="oauth-openshift"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.224906 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="88cbf782-9591-4d58-8a30-e4b145687dac" containerName="registry-server"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.225463 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.231158 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.231330 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.231379 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.231420 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.231465 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.231482 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.231601 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.231681 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.231802 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.232557 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.236253 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.239511 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.243051 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-78c4665959-52m7f"]
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.290259 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.291363 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.296709 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-template-login\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343159 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-audit-policies\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343200 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-session\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343232 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343366 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txb5s\" (UniqueName: \"kubernetes.io/projected/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-kube-api-access-txb5s\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343517 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343541 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-audit-dir\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-service-ca\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343720 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-router-certs\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.343746 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-template-error\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.446231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.446420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.446517 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txb5s\" (UniqueName: \"kubernetes.io/projected/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-kube-api-access-txb5s\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.446696 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.446765 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-audit-dir\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.446923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.447028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-audit-dir\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.447008 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.447158 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-service-ca\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.447187 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-router-certs\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.447239 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-template-error\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.447279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-template-login\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.447309 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-audit-policies\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.447344 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-session\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.447366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.447677 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.448428 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-service-ca\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.448568 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.448941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-audit-policies\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.451657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.451850 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.453389 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.453439 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.453686 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-router-certs\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.455584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-system-session\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.455601 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-template-error\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.455776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-v4-0-config-user-template-login\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.469361 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txb5s\" (UniqueName: \"kubernetes.io/projected/095d9287-0f50-4bc9-b3d5-4ae2459ac6e9-kube-api-access-txb5s\") pod \"oauth-openshift-78c4665959-52m7f\" (UID: \"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9\") " pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.588760 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f"
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.952124 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p"]
Mar 10 15:11:16 crc kubenswrapper[4795]: I0310 15:11:16.952739 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" containerName="controller-manager" containerID="cri-o://9fa95e8f9014e4a6bb7883e99a8c76601ddf618e3ebfb0b5ea31aa8b36481d59" gracePeriod=30
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.022625 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-78c4665959-52m7f"]
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.045165 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr"]
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.045461 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" containerName="route-controller-manager" containerID="cri-o://e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2" gracePeriod=30
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.127197 4795 generic.go:334] "Generic (PLEG): container finished" podID="9cdf041e-d9fa-42ac-875f-c54406e67281" containerID="9fa95e8f9014e4a6bb7883e99a8c76601ddf618e3ebfb0b5ea31aa8b36481d59" exitCode=0
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.127291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" event={"ID":"9cdf041e-d9fa-42ac-875f-c54406e67281","Type":"ContainerDied","Data":"9fa95e8f9014e4a6bb7883e99a8c76601ddf618e3ebfb0b5ea31aa8b36481d59"}
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.128813 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" event={"ID":"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9","Type":"ContainerStarted","Data":"8dee62cd3d34556e491132d72ae6c0302236eb60478547101c86ceaa7374fa7b"}
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.489799 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="194b0dd6-bbc0-4b57-b94b-89e4e46d87fb" path="/var/lib/kubelet/pods/194b0dd6-bbc0-4b57-b94b-89e4e46d87fb/volumes"
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.533332 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr"
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.568675 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f883be8f-2258-47fe-b609-768f8b0a50c7-serving-cert\") pod \"f883be8f-2258-47fe-b609-768f8b0a50c7\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") "
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.568734 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-client-ca\") pod \"f883be8f-2258-47fe-b609-768f8b0a50c7\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") "
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.568769 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkhmh\" (UniqueName: \"kubernetes.io/projected/f883be8f-2258-47fe-b609-768f8b0a50c7-kube-api-access-tkhmh\") pod \"f883be8f-2258-47fe-b609-768f8b0a50c7\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") "
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.568820 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-config\") pod \"f883be8f-2258-47fe-b609-768f8b0a50c7\" (UID: \"f883be8f-2258-47fe-b609-768f8b0a50c7\") "
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.569733 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-client-ca" (OuterVolumeSpecName: "client-ca") pod "f883be8f-2258-47fe-b609-768f8b0a50c7" (UID: "f883be8f-2258-47fe-b609-768f8b0a50c7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.569779 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-config" (OuterVolumeSpecName: "config") pod "f883be8f-2258-47fe-b609-768f8b0a50c7" (UID: "f883be8f-2258-47fe-b609-768f8b0a50c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.576248 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f883be8f-2258-47fe-b609-768f8b0a50c7-kube-api-access-tkhmh" (OuterVolumeSpecName: "kube-api-access-tkhmh") pod "f883be8f-2258-47fe-b609-768f8b0a50c7" (UID: "f883be8f-2258-47fe-b609-768f8b0a50c7"). InnerVolumeSpecName "kube-api-access-tkhmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.578698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f883be8f-2258-47fe-b609-768f8b0a50c7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f883be8f-2258-47fe-b609-768f8b0a50c7" (UID: "f883be8f-2258-47fe-b609-768f8b0a50c7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.622223 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p"
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.669941 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cdf041e-d9fa-42ac-875f-c54406e67281-serving-cert\") pod \"9cdf041e-d9fa-42ac-875f-c54406e67281\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") "
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.670347 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-client-ca\") pod \"9cdf041e-d9fa-42ac-875f-c54406e67281\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") "
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.670370 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgk76\" (UniqueName: \"kubernetes.io/projected/9cdf041e-d9fa-42ac-875f-c54406e67281-kube-api-access-sgk76\") pod \"9cdf041e-d9fa-42ac-875f-c54406e67281\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") "
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.670393 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-config\") pod \"9cdf041e-d9fa-42ac-875f-c54406e67281\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") "
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.670445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-proxy-ca-bundles\") pod \"9cdf041e-d9fa-42ac-875f-c54406e67281\" (UID: \"9cdf041e-d9fa-42ac-875f-c54406e67281\") "
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.671152 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f883be8f-2258-47fe-b609-768f8b0a50c7-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.671173 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.671184 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkhmh\" (UniqueName: \"kubernetes.io/projected/f883be8f-2258-47fe-b609-768f8b0a50c7-kube-api-access-tkhmh\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.671195 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f883be8f-2258-47fe-b609-768f8b0a50c7-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.671203 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-client-ca" (OuterVolumeSpecName: "client-ca") pod "9cdf041e-d9fa-42ac-875f-c54406e67281" (UID: "9cdf041e-d9fa-42ac-875f-c54406e67281"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.671269 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9cdf041e-d9fa-42ac-875f-c54406e67281" (UID: "9cdf041e-d9fa-42ac-875f-c54406e67281"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.671317 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-config" (OuterVolumeSpecName: "config") pod "9cdf041e-d9fa-42ac-875f-c54406e67281" (UID: "9cdf041e-d9fa-42ac-875f-c54406e67281"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.675173 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cdf041e-d9fa-42ac-875f-c54406e67281-kube-api-access-sgk76" (OuterVolumeSpecName: "kube-api-access-sgk76") pod "9cdf041e-d9fa-42ac-875f-c54406e67281" (UID: "9cdf041e-d9fa-42ac-875f-c54406e67281"). InnerVolumeSpecName "kube-api-access-sgk76". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.675179 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cdf041e-d9fa-42ac-875f-c54406e67281-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9cdf041e-d9fa-42ac-875f-c54406e67281" (UID: "9cdf041e-d9fa-42ac-875f-c54406e67281"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.771753 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-config\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.771778 4795 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.771788 4795 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9cdf041e-d9fa-42ac-875f-c54406e67281-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.771797 4795 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9cdf041e-d9fa-42ac-875f-c54406e67281-client-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:17 crc kubenswrapper[4795]: I0310 15:11:17.771857 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgk76\" (UniqueName: \"kubernetes.io/projected/9cdf041e-d9fa-42ac-875f-c54406e67281-kube-api-access-sgk76\") on node \"crc\" DevicePath \"\""
Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.039600 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.039977 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" containerName="controller-manager"
Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.040066 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" containerName="controller-manager"
Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.040157 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" containerName="route-controller-manager"
Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.040218 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" containerName="route-controller-manager"
Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.040355 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" containerName="route-controller-manager"
Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.040440 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" containerName="controller-manager"
Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.040765 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.040956 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.041315 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636" gracePeriod=15 Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.041336 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec" gracePeriod=15 Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.041428 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07" gracePeriod=15 Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.041530 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067" gracePeriod=15 Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.041434 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0" gracePeriod=15 Mar 10 15:11:18 crc 
kubenswrapper[4795]: I0310 15:11:18.075196 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.075262 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.075391 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.075429 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.075451 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.079425 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.113596 4795 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.116536 4795 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.116824 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.116851 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.116868 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.116879 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.116895 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.116905 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.116921 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.116931 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.116945 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.116955 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.116969 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.116979 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.116996 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117005 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.117019 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117030 
4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.117051 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117062 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117358 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117377 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117388 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117400 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117415 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117428 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117440 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117455 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.117590 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117604 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.117820 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.134546 4795 generic.go:334] "Generic (PLEG): container finished" podID="f883be8f-2258-47fe-b609-768f8b0a50c7" containerID="e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2" exitCode=0 Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.134606 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.134640 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" event={"ID":"f883be8f-2258-47fe-b609-768f8b0a50c7","Type":"ContainerDied","Data":"e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2"} Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.134681 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" event={"ID":"f883be8f-2258-47fe-b609-768f8b0a50c7","Type":"ContainerDied","Data":"34f056ae47097e565de221a5dd783d1c5744dff46aefce88a234caf06ea93387"} Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.134699 4795 scope.go:117] "RemoveContainer" containerID="e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.135357 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.135582 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.135805 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" event={"ID":"9cdf041e-d9fa-42ac-875f-c54406e67281","Type":"ContainerDied","Data":"0ef2babcd778479215f80f60b6fe50e3fa736d2c3baa09bd433c41399a82db29"} Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.135870 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.135896 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.136409 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.136753 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.136988 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.137217 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.145709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" event={"ID":"095d9287-0f50-4bc9-b3d5-4ae2459ac6e9","Type":"ContainerStarted","Data":"d968168d3cee3971ff5f6727ecb182ac91d900f5b5dde4fa460c7255638ce089"} Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.146461 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.146768 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.147004 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.147198 4795 
status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.147482 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.148261 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.152972 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.153506 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.154831 4795 status_manager.go:851] "Failed to get status for pod" 
podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.163028 4795 scope.go:117] "RemoveContainer" containerID="e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.163045 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.164497 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2\": container with ID starting with e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2 not found: ID does not exist" containerID="e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.164585 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2"} err="failed to get container status \"e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2\": rpc error: code = NotFound desc = could not find container \"e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2\": container with ID starting with e886fab8a25e9ae69b512faf22f5176782e5f158e4dd478f5b2e6da28402ace2 not found: ID does not exist" Mar 10 
15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.164681 4795 scope.go:117] "RemoveContainer" containerID="9fa95e8f9014e4a6bb7883e99a8c76601ddf618e3ebfb0b5ea31aa8b36481d59" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.164573 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.167025 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.167358 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.167584 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.167858 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" 
pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.168305 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.168509 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.178589 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.178648 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.178701 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.178732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.178751 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.178767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.178834 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.178870 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.179498 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.179619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.179783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.179984 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.180013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.277768 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.278699 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.279159 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.279404 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.279654 4795 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.279690 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" 
(UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.279722 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.279762 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.279690 4795 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.279807 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.279865 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.279894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.280023 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="200ms" Mar 10 15:11:18 crc kubenswrapper[4795]: I0310 15:11:18.373205 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 10 15:11:18 crc kubenswrapper[4795]: W0310 15:11:18.389731 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-1719920faea077c91c7b9cba39d7ef043bfcffb725e3924486a46f98d0df3013 WatchSource:0}: Error finding container 1719920faea077c91c7b9cba39d7ef043bfcffb725e3924486a46f98d0df3013: Status 404 returned error can't find the container with id 1719920faea077c91c7b9cba39d7ef043bfcffb725e3924486a46f98d0df3013 Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.392573 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b83835519214c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:11:18.39208686 +0000 UTC m=+311.557827758,LastTimestamp:2026-03-10 15:11:18.39208686 +0000 UTC m=+311.557827758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.480902 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="400ms" Mar 10 15:11:18 crc kubenswrapper[4795]: E0310 15:11:18.882436 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="800ms" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.155646 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb"} Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.155685 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1719920faea077c91c7b9cba39d7ef043bfcffb725e3924486a46f98d0df3013"} Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.156718 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.157192 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.157440 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.157748 4795 generic.go:334] "Generic (PLEG): container finished" podID="66d38c5d-4363-47e6-b417-89fef328eb00" containerID="1f4b307f05c204595682fc9ef18a5460240896c35add3a4e969ec37aa6941396" exitCode=0 Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.157771 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.157810 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"66d38c5d-4363-47e6-b417-89fef328eb00","Type":"ContainerDied","Data":"1f4b307f05c204595682fc9ef18a5460240896c35add3a4e969ec37aa6941396"} Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.158398 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.158934 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.159346 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.159535 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.159742 4795 status_manager.go:851] "Failed to get status for pod" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.160016 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.160349 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.160793 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.162331 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.163353 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec" exitCode=0 Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.163383 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0" exitCode=0 Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.163390 4795 scope.go:117] "RemoveContainer" containerID="2eabb2117d1237d2977d3679a4c02f5f0c7807051d6bcb8704d414b69ce56b38" Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.163397 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067" exitCode=0 Mar 10 15:11:19 crc kubenswrapper[4795]: I0310 15:11:19.163409 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07" exitCode=2 Mar 10 15:11:19 crc kubenswrapper[4795]: E0310 15:11:19.683103 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="1.6s" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.176548 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.419444 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.423317 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.424565 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.425019 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.425496 4795 status_manager.go:851] "Failed to get status for pod" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.425891 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.426151 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.426510 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.457372 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.457958 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.458432 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.458695 4795 status_manager.go:851] "Failed to get status for pod" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc 
kubenswrapper[4795]: I0310 15:11:20.458959 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.459257 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.459515 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.511328 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.511460 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.511472 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.511497 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.511545 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.511597 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66d38c5d-4363-47e6-b417-89fef328eb00-kube-api-access\") pod \"66d38c5d-4363-47e6-b417-89fef328eb00\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.511630 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.511663 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-kubelet-dir\") pod \"66d38c5d-4363-47e6-b417-89fef328eb00\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.511693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-var-lock\") pod \"66d38c5d-4363-47e6-b417-89fef328eb00\" (UID: \"66d38c5d-4363-47e6-b417-89fef328eb00\") " Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.511728 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "66d38c5d-4363-47e6-b417-89fef328eb00" (UID: "66d38c5d-4363-47e6-b417-89fef328eb00"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.511815 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-var-lock" (OuterVolumeSpecName: "var-lock") pod "66d38c5d-4363-47e6-b417-89fef328eb00" (UID: "66d38c5d-4363-47e6-b417-89fef328eb00"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.512244 4795 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.512271 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.512293 4795 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.512310 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/66d38c5d-4363-47e6-b417-89fef328eb00-var-lock\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.512327 4795 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.517935 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d38c5d-4363-47e6-b417-89fef328eb00-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "66d38c5d-4363-47e6-b417-89fef328eb00" (UID: "66d38c5d-4363-47e6-b417-89fef328eb00"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:11:20 crc kubenswrapper[4795]: I0310 15:11:20.613716 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/66d38c5d-4363-47e6-b417-89fef328eb00-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.192110 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.192130 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"66d38c5d-4363-47e6-b417-89fef328eb00","Type":"ContainerDied","Data":"83b7b03e604a811e65658c4bf2cf4559c0c4d4db1d4708a00694dafb089edf41"} Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.192184 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83b7b03e604a811e65658c4bf2cf4559c0c4d4db1d4708a00694dafb089edf41" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.198246 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.199200 4795 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636" exitCode=0 Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.199273 4795 scope.go:117] "RemoveContainer" containerID="7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.199308 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.209267 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.210045 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.210728 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.211228 4795 status_manager.go:851] "Failed to get status for pod" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.211573 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.211836 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.221518 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.221747 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.221959 4795 status_manager.go:851] "Failed to get status for pod" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.222213 4795 status_manager.go:851] "Failed to get status for pod" 
podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.222430 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.222534 4795 scope.go:117] "RemoveContainer" containerID="d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.222648 4795 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.251050 4795 scope.go:117] "RemoveContainer" containerID="f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.267201 4795 scope.go:117] "RemoveContainer" containerID="766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07" Mar 10 15:11:21 crc kubenswrapper[4795]: E0310 15:11:21.284569 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" 
interval="3.2s" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.285868 4795 scope.go:117] "RemoveContainer" containerID="837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.307507 4795 scope.go:117] "RemoveContainer" containerID="e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.335756 4795 scope.go:117] "RemoveContainer" containerID="7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec" Mar 10 15:11:21 crc kubenswrapper[4795]: E0310 15:11:21.337131 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\": container with ID starting with 7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec not found: ID does not exist" containerID="7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.337191 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec"} err="failed to get container status \"7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\": rpc error: code = NotFound desc = could not find container \"7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec\": container with ID starting with 7b0e4fb4d45d8fa3b40696f7379005f91b35c48e350e0f0be17530e81c0d8aec not found: ID does not exist" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.337227 4795 scope.go:117] "RemoveContainer" containerID="d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0" Mar 10 15:11:21 crc kubenswrapper[4795]: E0310 15:11:21.337835 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\": container with ID starting with d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0 not found: ID does not exist" containerID="d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.337910 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0"} err="failed to get container status \"d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\": rpc error: code = NotFound desc = could not find container \"d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0\": container with ID starting with d2d86dad600c9d09a6af9893fd1d3f4e212383548a0f84652cd656f30263b8e0 not found: ID does not exist" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.337966 4795 scope.go:117] "RemoveContainer" containerID="f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067" Mar 10 15:11:21 crc kubenswrapper[4795]: E0310 15:11:21.338713 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\": container with ID starting with f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067 not found: ID does not exist" containerID="f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.338744 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067"} err="failed to get container status \"f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\": rpc error: code = NotFound desc = could not find container \"f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067\": container with ID 
starting with f9b1d02dd80127512789ddfd3e1e2f1c1d497862f16be4691517923c7f688067 not found: ID does not exist" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.338767 4795 scope.go:117] "RemoveContainer" containerID="766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07" Mar 10 15:11:21 crc kubenswrapper[4795]: E0310 15:11:21.339123 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\": container with ID starting with 766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07 not found: ID does not exist" containerID="766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.339151 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07"} err="failed to get container status \"766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\": rpc error: code = NotFound desc = could not find container \"766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07\": container with ID starting with 766f0c215edf1f2981f9f7703fa4eeaa5d9ed9968a0d810bbec587082ad58d07 not found: ID does not exist" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.339172 4795 scope.go:117] "RemoveContainer" containerID="837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636" Mar 10 15:11:21 crc kubenswrapper[4795]: E0310 15:11:21.339390 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\": container with ID starting with 837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636 not found: ID does not exist" containerID="837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636" Mar 10 
15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.339410 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636"} err="failed to get container status \"837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\": rpc error: code = NotFound desc = could not find container \"837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636\": container with ID starting with 837f1e2b526de688b78a794a77ab7fc4803314fb5900634ba82f79419f9d6636 not found: ID does not exist" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.339423 4795 scope.go:117] "RemoveContainer" containerID="e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886" Mar 10 15:11:21 crc kubenswrapper[4795]: E0310 15:11:21.339655 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\": container with ID starting with e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886 not found: ID does not exist" containerID="e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.339677 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886"} err="failed to get container status \"e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\": rpc error: code = NotFound desc = could not find container \"e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886\": container with ID starting with e5b8a6506f51a94f0645e802e4da7e2788ceb8550a0e071d3c9aeb6d33cdd886 not found: ID does not exist" Mar 10 15:11:21 crc kubenswrapper[4795]: I0310 15:11:21.489338 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" 
path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 10 15:11:23 crc kubenswrapper[4795]: E0310 15:11:23.345729 4795 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.32:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b83835519214c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-10 15:11:18.39208686 +0000 UTC m=+311.557827758,LastTimestamp:2026-03-10 15:11:18.39208686 +0000 UTC m=+311.557827758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 10 15:11:24 crc kubenswrapper[4795]: E0310 15:11:24.485734 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="6.4s" Mar 10 15:11:27 crc kubenswrapper[4795]: I0310 15:11:27.479027 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:27 crc kubenswrapper[4795]: I0310 15:11:27.479695 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:27 crc kubenswrapper[4795]: I0310 15:11:27.480558 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:27 crc kubenswrapper[4795]: I0310 15:11:27.481163 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:27 crc kubenswrapper[4795]: I0310 15:11:27.483262 4795 status_manager.go:851] "Failed to get status for pod" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:27 crc kubenswrapper[4795]: E0310 15:11:27.579049 4795 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing 
PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.32:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" volumeName="registry-storage" Mar 10 15:11:30 crc kubenswrapper[4795]: E0310 15:11:30.886924 4795 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.32:6443: connect: connection refused" interval="7s" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.274940 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.276930 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.277004 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2" exitCode=1 Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.277059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2"} Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.277780 4795 scope.go:117] "RemoveContainer" containerID="40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2" Mar 10 15:11:31 crc 
kubenswrapper[4795]: I0310 15:11:31.278476 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.278955 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.279561 4795 status_manager.go:851] "Failed to get status for pod" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.280203 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.280687 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.281129 4795 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.476795 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.478186 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.478463 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.478775 4795 status_manager.go:851] "Failed to get status for pod" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: 
connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.479043 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.479366 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.479550 4795 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.524263 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.524300 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:31 crc kubenswrapper[4795]: E0310 15:11:31.524744 4795 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.32:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:31 crc kubenswrapper[4795]: I0310 15:11:31.525175 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:31 crc kubenswrapper[4795]: W0310 15:11:31.555037 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-2026cc1538a3bb50ca4b1dae798f97a4af7ef7e8bc468e99faa5bbcefd36b29f WatchSource:0}: Error finding container 2026cc1538a3bb50ca4b1dae798f97a4af7ef7e8bc468e99faa5bbcefd36b29f: Status 404 returned error can't find the container with id 2026cc1538a3bb50ca4b1dae798f97a4af7ef7e8bc468e99faa5bbcefd36b29f Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.287773 4795 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="77f5bea37da96555d6f93711a834ebedb1542079cff8b079b859e7a20c208918" exitCode=0 Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.287852 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"77f5bea37da96555d6f93711a834ebedb1542079cff8b079b859e7a20c208918"} Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.288208 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2026cc1538a3bb50ca4b1dae798f97a4af7ef7e8bc468e99faa5bbcefd36b29f"} Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.288615 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.288638 4795 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.289058 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:32 crc kubenswrapper[4795]: E0310 15:11:32.289100 4795 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.289502 4795 status_manager.go:851] "Failed to get status for pod" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.289709 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.289936 4795 status_manager.go:851] "Failed to get status for pod" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.290342 4795 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.290663 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.291283 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.293240 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.293309 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c5d0ae0f6d530c2a7ecaf4f52aab67489ac105f8a178670dbfb2a826efe89c93"} Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.294229 4795 status_manager.go:851] "Failed to get status for pod" 
podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" pod="openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cf8c4b57b-t9j9p\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.294561 4795 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.294991 4795 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.295320 4795 status_manager.go:851] "Failed to get status for pod" podUID="095d9287-0f50-4bc9-b3d5-4ae2459ac6e9" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-78c4665959-52m7f\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 15:11:32.295787 4795 status_manager.go:851] "Failed to get status for pod" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:32 crc kubenswrapper[4795]: I0310 
15:11:32.296246 4795 status_manager.go:851] "Failed to get status for pod" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" pod="openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-9bd6b545c-stvtr\": dial tcp 38.102.83.32:6443: connect: connection refused" Mar 10 15:11:33 crc kubenswrapper[4795]: I0310 15:11:33.210492 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:11:33 crc kubenswrapper[4795]: I0310 15:11:33.302179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a4a805ce3484d0020b25de8f31b43e9c865f750b40f8b39b669ed3c25392b61a"} Mar 10 15:11:33 crc kubenswrapper[4795]: I0310 15:11:33.302224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7941eeaf747de637beadd398f10b5812d2f1f9f79478a66a4138a4a54ff7fd0e"} Mar 10 15:11:33 crc kubenswrapper[4795]: I0310 15:11:33.302237 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"3c89e058ee006b7eb94e513d022fad83daeaa71a8630d34f26cc3e6ffb0d42e9"} Mar 10 15:11:34 crc kubenswrapper[4795]: I0310 15:11:34.062756 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:11:34 crc kubenswrapper[4795]: I0310 15:11:34.063276 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe 
status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 10 15:11:34 crc kubenswrapper[4795]: I0310 15:11:34.063313 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 10 15:11:34 crc kubenswrapper[4795]: I0310 15:11:34.312736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"713bedd3a6c73b432a83f69fb3f7978bdf249dd59530caade55a6f9e836e0737"} Mar 10 15:11:34 crc kubenswrapper[4795]: I0310 15:11:34.313416 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9f75b9a86f27d5cb2c58cad2d3a3f6d0fe649514a8e1ca5473d7cf0418703a2e"} Mar 10 15:11:34 crc kubenswrapper[4795]: I0310 15:11:34.313159 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:34 crc kubenswrapper[4795]: I0310 15:11:34.313714 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:36 crc kubenswrapper[4795]: I0310 15:11:36.525809 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:36 crc kubenswrapper[4795]: I0310 15:11:36.526149 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:36 crc 
kubenswrapper[4795]: I0310 15:11:36.532415 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.548122 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.548242 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.551243 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.551251 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.561231 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.574793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.575652 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.649861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.649977 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.656587 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.657540 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 
10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.662655 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.800736 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.820324 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 10 15:11:37 crc kubenswrapper[4795]: I0310 15:11:37.828281 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:11:38 crc kubenswrapper[4795]: W0310 15:11:38.250029 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-090a0ca441e3f51770a24755ff31bab25cc6023a01a5e9d68537ec9f139db134 WatchSource:0}: Error finding container 090a0ca441e3f51770a24755ff31bab25cc6023a01a5e9d68537ec9f139db134: Status 404 returned error can't find the container with id 090a0ca441e3f51770a24755ff31bab25cc6023a01a5e9d68537ec9f139db134 Mar 10 15:11:38 crc kubenswrapper[4795]: W0310 15:11:38.279250 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-537d42cd46d6f686e7e5a3304ee3840ec548ef9dbb8bd157f59a41b02d3267f2 WatchSource:0}: Error finding container 537d42cd46d6f686e7e5a3304ee3840ec548ef9dbb8bd157f59a41b02d3267f2: 
Status 404 returned error can't find the container with id 537d42cd46d6f686e7e5a3304ee3840ec548ef9dbb8bd157f59a41b02d3267f2 Mar 10 15:11:38 crc kubenswrapper[4795]: I0310 15:11:38.348924 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"090a0ca441e3f51770a24755ff31bab25cc6023a01a5e9d68537ec9f139db134"} Mar 10 15:11:38 crc kubenswrapper[4795]: W0310 15:11:38.348959 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-67b18e034a093dd857d159462010d55db016ab25f0b403bae1c9bd7beacc4457 WatchSource:0}: Error finding container 67b18e034a093dd857d159462010d55db016ab25f0b403bae1c9bd7beacc4457: Status 404 returned error can't find the container with id 67b18e034a093dd857d159462010d55db016ab25f0b403bae1c9bd7beacc4457 Mar 10 15:11:38 crc kubenswrapper[4795]: I0310 15:11:38.350428 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"537d42cd46d6f686e7e5a3304ee3840ec548ef9dbb8bd157f59a41b02d3267f2"} Mar 10 15:11:39 crc kubenswrapper[4795]: I0310 15:11:39.324977 4795 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:39 crc kubenswrapper[4795]: I0310 15:11:39.356656 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5db9cf1376b86549c188a10dea6b736c2424fd24a401b362d27c070a4662653f"} Mar 10 15:11:39 crc kubenswrapper[4795]: I0310 15:11:39.358836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"370f356b4e96ae1d05409d6749094f7f3c972b22a8b7e7eba3b8cf116acc79bf"} Mar 10 15:11:39 crc kubenswrapper[4795]: I0310 15:11:39.362921 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:39 crc kubenswrapper[4795]: I0310 15:11:39.362949 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:39 crc kubenswrapper[4795]: I0310 15:11:39.363798 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 10 15:11:39 crc kubenswrapper[4795]: I0310 15:11:39.363862 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:39 crc kubenswrapper[4795]: I0310 15:11:39.363877 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"67b18e034a093dd857d159462010d55db016ab25f0b403bae1c9bd7beacc4457"} Mar 10 15:11:39 crc kubenswrapper[4795]: I0310 15:11:39.363895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0f515bd7c4c72ede70aa642efcbecf7f3831442ece7e901b7a24ac368a933e87"} Mar 10 15:11:39 crc kubenswrapper[4795]: I0310 15:11:39.366879 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:39 crc kubenswrapper[4795]: I0310 15:11:39.462933 4795 status_manager.go:861] "Pod was deleted and then recreated, 
skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7810aa1b-8383-4428-8ce6-5f7be86726ea" Mar 10 15:11:40 crc kubenswrapper[4795]: I0310 15:11:40.372687 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 10 15:11:40 crc kubenswrapper[4795]: I0310 15:11:40.373248 4795 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="5db9cf1376b86549c188a10dea6b736c2424fd24a401b362d27c070a4662653f" exitCode=255 Mar 10 15:11:40 crc kubenswrapper[4795]: I0310 15:11:40.373407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"5db9cf1376b86549c188a10dea6b736c2424fd24a401b362d27c070a4662653f"} Mar 10 15:11:40 crc kubenswrapper[4795]: I0310 15:11:40.373822 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:40 crc kubenswrapper[4795]: I0310 15:11:40.373848 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:40 crc kubenswrapper[4795]: I0310 15:11:40.374402 4795 scope.go:117] "RemoveContainer" containerID="5db9cf1376b86549c188a10dea6b736c2424fd24a401b362d27c070a4662653f" Mar 10 15:11:40 crc kubenswrapper[4795]: I0310 15:11:40.397561 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7810aa1b-8383-4428-8ce6-5f7be86726ea" Mar 10 15:11:41 crc kubenswrapper[4795]: I0310 15:11:41.383749 4795 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 10 15:11:41 crc kubenswrapper[4795]: I0310 15:11:41.384854 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/0.log" Mar 10 15:11:41 crc kubenswrapper[4795]: I0310 15:11:41.384912 4795 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="20c6ee4ccb6213011b3940b235c040daf85a3677c71b62008ae6c523980228a2" exitCode=255 Mar 10 15:11:41 crc kubenswrapper[4795]: I0310 15:11:41.385293 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:41 crc kubenswrapper[4795]: I0310 15:11:41.385312 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:41 crc kubenswrapper[4795]: I0310 15:11:41.385320 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"20c6ee4ccb6213011b3940b235c040daf85a3677c71b62008ae6c523980228a2"} Mar 10 15:11:41 crc kubenswrapper[4795]: I0310 15:11:41.385384 4795 scope.go:117] "RemoveContainer" containerID="5db9cf1376b86549c188a10dea6b736c2424fd24a401b362d27c070a4662653f" Mar 10 15:11:41 crc kubenswrapper[4795]: I0310 15:11:41.385891 4795 scope.go:117] "RemoveContainer" containerID="20c6ee4ccb6213011b3940b235c040daf85a3677c71b62008ae6c523980228a2" Mar 10 15:11:41 crc kubenswrapper[4795]: E0310 15:11:41.386587 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: 
\"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:11:41 crc kubenswrapper[4795]: I0310 15:11:41.409076 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="7810aa1b-8383-4428-8ce6-5f7be86726ea" Mar 10 15:11:42 crc kubenswrapper[4795]: I0310 15:11:42.391858 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 10 15:11:42 crc kubenswrapper[4795]: I0310 15:11:42.392427 4795 scope.go:117] "RemoveContainer" containerID="20c6ee4ccb6213011b3940b235c040daf85a3677c71b62008ae6c523980228a2" Mar 10 15:11:42 crc kubenswrapper[4795]: E0310 15:11:42.392628 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:11:44 crc kubenswrapper[4795]: I0310 15:11:44.062427 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 10 15:11:44 crc kubenswrapper[4795]: I0310 15:11:44.062495 4795 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 10 15:11:49 crc kubenswrapper[4795]: I0310 15:11:49.516166 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 10 15:11:49 crc kubenswrapper[4795]: I0310 15:11:49.580381 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 10 15:11:49 crc kubenswrapper[4795]: I0310 15:11:49.919642 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 10 15:11:49 crc kubenswrapper[4795]: I0310 15:11:49.950569 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 10 15:11:50 crc kubenswrapper[4795]: I0310 15:11:50.192822 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 10 15:11:50 crc kubenswrapper[4795]: I0310 15:11:50.445497 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 10 15:11:50 crc kubenswrapper[4795]: I0310 15:11:50.577788 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 10 15:11:50 crc kubenswrapper[4795]: I0310 15:11:50.614475 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 10 15:11:50 crc kubenswrapper[4795]: I0310 15:11:50.753896 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 10 15:11:50 crc kubenswrapper[4795]: I0310 15:11:50.791337 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 10 15:11:50 crc kubenswrapper[4795]: I0310 15:11:50.882994 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 10 15:11:51 crc kubenswrapper[4795]: I0310 15:11:51.067841 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 10 15:11:51 crc kubenswrapper[4795]: I0310 15:11:51.173860 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 10 15:11:51 crc kubenswrapper[4795]: I0310 15:11:51.204725 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 10 15:11:51 crc kubenswrapper[4795]: I0310 15:11:51.250490 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 10 15:11:51 crc kubenswrapper[4795]: I0310 15:11:51.377890 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 10 15:11:51 crc kubenswrapper[4795]: I0310 15:11:51.600798 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 15:11:51 crc kubenswrapper[4795]: I0310 15:11:51.727988 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 10 15:11:51 crc kubenswrapper[4795]: I0310 15:11:51.834556 4795 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 10 15:11:51 crc kubenswrapper[4795]: I0310 15:11:51.913262 4795 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 10 15:11:51 crc kubenswrapper[4795]: I0310 15:11:51.957767 4795 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.088147 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.170403 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.214901 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.216558 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.225609 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.525521 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.586725 4795 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.589729 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=34.589710072 podStartE2EDuration="34.589710072s" podCreationTimestamp="2026-03-10 15:11:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:11:39.349606949 +0000 UTC m=+332.515347847" watchObservedRunningTime="2026-03-10 15:11:52.589710072 +0000 UTC m=+345.755451010" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.590544 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-78c4665959-52m7f" podStartSLOduration=63.590503185 podStartE2EDuration="1m3.590503185s" podCreationTimestamp="2026-03-10 15:10:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:11:39.370820099 +0000 UTC m=+332.536560997" watchObservedRunningTime="2026-03-10 15:11:52.590503185 +0000 UTC m=+345.756244123" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.594726 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-controller-manager/controller-manager-6cf8c4b57b-t9j9p","openshift-route-controller-manager/route-controller-manager-9bd6b545c-stvtr"] Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.594809 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-545665d4f8-wjzgw","openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv","openshift-kube-apiserver/kube-apiserver-crc"] Mar 10 15:11:52 crc kubenswrapper[4795]: E0310 15:11:52.595129 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" containerName="installer" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.595150 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" containerName="installer" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.595319 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d38c5d-4363-47e6-b417-89fef328eb00" 
containerName="installer" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.596025 4795 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.596061 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e6b1ac9a-c515-42e3-ae9d-b05a058df9bc" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.596521 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.597012 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.599542 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.600132 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.601006 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.601851 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.602256 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.603115 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 10 
15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.603548 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.604112 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.604435 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.604701 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.605060 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.606559 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.608171 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.609563 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.611879 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.618296 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.650708 4795 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=13.650677135 podStartE2EDuration="13.650677135s" podCreationTimestamp="2026-03-10 15:11:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:11:52.647684529 +0000 UTC m=+345.813425467" watchObservedRunningTime="2026-03-10 15:11:52.650677135 +0000 UTC m=+345.816418073" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.660617 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.699871 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/200c579d-a88e-4a3c-a109-fe9177ce202b-serving-cert\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.699923 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7bf693e-3e78-421e-9b0d-cf526e49cf17-config\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: \"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.699945 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7bf693e-3e78-421e-9b0d-cf526e49cf17-client-ca\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: \"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc 
kubenswrapper[4795]: I0310 15:11:52.699965 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7bf693e-3e78-421e-9b0d-cf526e49cf17-serving-cert\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: \"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.700003 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/200c579d-a88e-4a3c-a109-fe9177ce202b-client-ca\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.700114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngn9s\" (UniqueName: \"kubernetes.io/projected/200c579d-a88e-4a3c-a109-fe9177ce202b-kube-api-access-ngn9s\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.700231 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p27g\" (UniqueName: \"kubernetes.io/projected/b7bf693e-3e78-421e-9b0d-cf526e49cf17-kube-api-access-4p27g\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: \"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.700260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/200c579d-a88e-4a3c-a109-fe9177ce202b-proxy-ca-bundles\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.700290 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/200c579d-a88e-4a3c-a109-fe9177ce202b-config\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.802052 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngn9s\" (UniqueName: \"kubernetes.io/projected/200c579d-a88e-4a3c-a109-fe9177ce202b-kube-api-access-ngn9s\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.802185 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p27g\" (UniqueName: \"kubernetes.io/projected/b7bf693e-3e78-421e-9b0d-cf526e49cf17-kube-api-access-4p27g\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: \"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.802235 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/200c579d-a88e-4a3c-a109-fe9177ce202b-proxy-ca-bundles\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 
15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.802284 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/200c579d-a88e-4a3c-a109-fe9177ce202b-config\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.802330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/200c579d-a88e-4a3c-a109-fe9177ce202b-serving-cert\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.802364 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7bf693e-3e78-421e-9b0d-cf526e49cf17-config\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: \"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.802392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7bf693e-3e78-421e-9b0d-cf526e49cf17-client-ca\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: \"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.802419 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7bf693e-3e78-421e-9b0d-cf526e49cf17-serving-cert\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: 
\"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.802469 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/200c579d-a88e-4a3c-a109-fe9177ce202b-client-ca\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.804580 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/200c579d-a88e-4a3c-a109-fe9177ce202b-client-ca\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.804796 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b7bf693e-3e78-421e-9b0d-cf526e49cf17-client-ca\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: \"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.804928 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7bf693e-3e78-421e-9b0d-cf526e49cf17-config\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: \"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.804999 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/200c579d-a88e-4a3c-a109-fe9177ce202b-proxy-ca-bundles\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.805484 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/200c579d-a88e-4a3c-a109-fe9177ce202b-config\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.813669 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/200c579d-a88e-4a3c-a109-fe9177ce202b-serving-cert\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.814934 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b7bf693e-3e78-421e-9b0d-cf526e49cf17-serving-cert\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: \"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.825957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p27g\" (UniqueName: \"kubernetes.io/projected/b7bf693e-3e78-421e-9b0d-cf526e49cf17-kube-api-access-4p27g\") pod \"route-controller-manager-7d5456844b-46pfv\" (UID: \"b7bf693e-3e78-421e-9b0d-cf526e49cf17\") " pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.827858 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.839277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngn9s\" (UniqueName: \"kubernetes.io/projected/200c579d-a88e-4a3c-a109-fe9177ce202b-kube-api-access-ngn9s\") pod \"controller-manager-545665d4f8-wjzgw\" (UID: \"200c579d-a88e-4a3c-a109-fe9177ce202b\") " pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.848249 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.914432 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.926474 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.930409 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.945788 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:52 crc kubenswrapper[4795]: I0310 15:11:52.995317 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.094575 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.116650 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.188907 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.253139 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.266801 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.292713 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.417604 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.445237 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.482589 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cdf041e-d9fa-42ac-875f-c54406e67281" path="/var/lib/kubelet/pods/9cdf041e-d9fa-42ac-875f-c54406e67281/volumes" Mar 
10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.483467 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f883be8f-2258-47fe-b609-768f8b0a50c7" path="/var/lib/kubelet/pods/f883be8f-2258-47fe-b609-768f8b0a50c7/volumes" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.568382 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.652190 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.868194 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.961140 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 10 15:11:53 crc kubenswrapper[4795]: I0310 15:11:53.968947 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.028781 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.062542 4795 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.062940 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" 
containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.063248 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.064375 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"c5d0ae0f6d530c2a7ecaf4f52aab67489ac105f8a178670dbfb2a826efe89c93"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.064758 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://c5d0ae0f6d530c2a7ecaf4f52aab67489ac105f8a178670dbfb2a826efe89c93" gracePeriod=30 Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.125285 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.137185 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.141499 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.315449 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 10 15:11:54 crc 
kubenswrapper[4795]: I0310 15:11:54.339974 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.352952 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.383430 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.526184 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.624190 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.646875 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.733028 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.740366 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.748696 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 15:11:54.834057 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 10 15:11:54 crc kubenswrapper[4795]: I0310 
15:11:54.987239 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.059779 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.160548 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.243522 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.258673 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.329019 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.334772 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.354825 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.524807 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.549354 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.578273 4795 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.595860 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.636257 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.689688 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.692786 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.723225 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.781561 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.890638 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 10 15:11:55 crc kubenswrapper[4795]: I0310 15:11:55.898757 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 10 15:11:56 crc kubenswrapper[4795]: E0310 15:11:56.188579 4795 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 15:11:56 crc kubenswrapper[4795]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-7d5456844b-46pfv_openshift-route-controller-manager_b7bf693e-3e78-421e-9b0d-cf526e49cf17_0(f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6): error adding pod 
openshift-route-controller-manager_route-controller-manager-7d5456844b-46pfv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6" Netns:"/var/run/netns/35ac8faf-52ce-493c-a9ac-1384c13462d7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-7d5456844b-46pfv;K8S_POD_INFRA_CONTAINER_ID=f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6;K8S_POD_UID=b7bf693e-3e78-421e-9b0d-cf526e49cf17" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv] networking: Multus: [openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv/b7bf693e-3e78-421e-9b0d-cf526e49cf17]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-7d5456844b-46pfv in out of cluster comm: pod "route-controller-manager-7d5456844b-46pfv" not found Mar 10 15:11:56 crc kubenswrapper[4795]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:56 crc kubenswrapper[4795]: > Mar 10 15:11:56 crc kubenswrapper[4795]: E0310 15:11:56.188654 4795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 15:11:56 crc kubenswrapper[4795]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_route-controller-manager-7d5456844b-46pfv_openshift-route-controller-manager_b7bf693e-3e78-421e-9b0d-cf526e49cf17_0(f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6): error adding pod openshift-route-controller-manager_route-controller-manager-7d5456844b-46pfv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6" Netns:"/var/run/netns/35ac8faf-52ce-493c-a9ac-1384c13462d7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-7d5456844b-46pfv;K8S_POD_INFRA_CONTAINER_ID=f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6;K8S_POD_UID=b7bf693e-3e78-421e-9b0d-cf526e49cf17" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv] networking: Multus: [openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv/b7bf693e-3e78-421e-9b0d-cf526e49cf17]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-7d5456844b-46pfv in out of cluster comm: pod "route-controller-manager-7d5456844b-46pfv" not found Mar 10 15:11:56 crc kubenswrapper[4795]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:56 crc kubenswrapper[4795]: > pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:56 crc kubenswrapper[4795]: E0310 15:11:56.188678 4795 kuberuntime_manager.go:1170] 
"CreatePodSandbox for pod failed" err=< Mar 10 15:11:56 crc kubenswrapper[4795]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-7d5456844b-46pfv_openshift-route-controller-manager_b7bf693e-3e78-421e-9b0d-cf526e49cf17_0(f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6): error adding pod openshift-route-controller-manager_route-controller-manager-7d5456844b-46pfv to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6" Netns:"/var/run/netns/35ac8faf-52ce-493c-a9ac-1384c13462d7" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-7d5456844b-46pfv;K8S_POD_INFRA_CONTAINER_ID=f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6;K8S_POD_UID=b7bf693e-3e78-421e-9b0d-cf526e49cf17" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv] networking: Multus: [openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv/b7bf693e-3e78-421e-9b0d-cf526e49cf17]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-7d5456844b-46pfv in out of cluster comm: pod "route-controller-manager-7d5456844b-46pfv" not found Mar 10 15:11:56 crc kubenswrapper[4795]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:56 crc kubenswrapper[4795]: > 
pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" Mar 10 15:11:56 crc kubenswrapper[4795]: E0310 15:11:56.188746 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-7d5456844b-46pfv_openshift-route-controller-manager(b7bf693e-3e78-421e-9b0d-cf526e49cf17)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-7d5456844b-46pfv_openshift-route-controller-manager(b7bf693e-3e78-421e-9b0d-cf526e49cf17)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-7d5456844b-46pfv_openshift-route-controller-manager_b7bf693e-3e78-421e-9b0d-cf526e49cf17_0(f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6): error adding pod openshift-route-controller-manager_route-controller-manager-7d5456844b-46pfv to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6\\\" Netns:\\\"/var/run/netns/35ac8faf-52ce-493c-a9ac-1384c13462d7\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-7d5456844b-46pfv;K8S_POD_INFRA_CONTAINER_ID=f96fcd02e1ba968df183e40fe973787d3fc660bcbd8f5081247b563985d521a6;K8S_POD_UID=b7bf693e-3e78-421e-9b0d-cf526e49cf17\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv] networking: Multus: [openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv/b7bf693e-3e78-421e-9b0d-cf526e49cf17]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod route-controller-manager-7d5456844b-46pfv in out of cluster comm: pod 
\\\"route-controller-manager-7d5456844b-46pfv\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" podUID="b7bf693e-3e78-421e-9b0d-cf526e49cf17" Mar 10 15:11:56 crc kubenswrapper[4795]: E0310 15:11:56.214765 4795 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 15:11:56 crc kubenswrapper[4795]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-545665d4f8-wjzgw_openshift-controller-manager_200c579d-a88e-4a3c-a109-fe9177ce202b_0(a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27): error adding pod openshift-controller-manager_controller-manager-545665d4f8-wjzgw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27" Netns:"/var/run/netns/a6c4d617-8397-43e6-9cc1-3f183aee126d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-545665d4f8-wjzgw;K8S_POD_INFRA_CONTAINER_ID=a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27;K8S_POD_UID=200c579d-a88e-4a3c-a109-fe9177ce202b" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-545665d4f8-wjzgw] networking: Multus: [openshift-controller-manager/controller-manager-545665d4f8-wjzgw/200c579d-a88e-4a3c-a109-fe9177ce202b]: error setting the 
networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-545665d4f8-wjzgw in out of cluster comm: pod "controller-manager-545665d4f8-wjzgw" not found Mar 10 15:11:56 crc kubenswrapper[4795]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:56 crc kubenswrapper[4795]: > Mar 10 15:11:56 crc kubenswrapper[4795]: E0310 15:11:56.215140 4795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 15:11:56 crc kubenswrapper[4795]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-545665d4f8-wjzgw_openshift-controller-manager_200c579d-a88e-4a3c-a109-fe9177ce202b_0(a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27): error adding pod openshift-controller-manager_controller-manager-545665d4f8-wjzgw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27" Netns:"/var/run/netns/a6c4d617-8397-43e6-9cc1-3f183aee126d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-545665d4f8-wjzgw;K8S_POD_INFRA_CONTAINER_ID=a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27;K8S_POD_UID=200c579d-a88e-4a3c-a109-fe9177ce202b" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-545665d4f8-wjzgw] networking: Multus: [openshift-controller-manager/controller-manager-545665d4f8-wjzgw/200c579d-a88e-4a3c-a109-fe9177ce202b]: error setting the 
networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-545665d4f8-wjzgw in out of cluster comm: pod "controller-manager-545665d4f8-wjzgw" not found Mar 10 15:11:56 crc kubenswrapper[4795]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:56 crc kubenswrapper[4795]: > pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:56 crc kubenswrapper[4795]: E0310 15:11:56.215167 4795 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 10 15:11:56 crc kubenswrapper[4795]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-545665d4f8-wjzgw_openshift-controller-manager_200c579d-a88e-4a3c-a109-fe9177ce202b_0(a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27): error adding pod openshift-controller-manager_controller-manager-545665d4f8-wjzgw to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27" Netns:"/var/run/netns/a6c4d617-8397-43e6-9cc1-3f183aee126d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-545665d4f8-wjzgw;K8S_POD_INFRA_CONTAINER_ID=a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27;K8S_POD_UID=200c579d-a88e-4a3c-a109-fe9177ce202b" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-545665d4f8-wjzgw] networking: Multus: 
[openshift-controller-manager/controller-manager-545665d4f8-wjzgw/200c579d-a88e-4a3c-a109-fe9177ce202b]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-545665d4f8-wjzgw in out of cluster comm: pod "controller-manager-545665d4f8-wjzgw" not found Mar 10 15:11:56 crc kubenswrapper[4795]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:11:56 crc kubenswrapper[4795]: > pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" Mar 10 15:11:56 crc kubenswrapper[4795]: E0310 15:11:56.215238 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-545665d4f8-wjzgw_openshift-controller-manager(200c579d-a88e-4a3c-a109-fe9177ce202b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-545665d4f8-wjzgw_openshift-controller-manager(200c579d-a88e-4a3c-a109-fe9177ce202b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-545665d4f8-wjzgw_openshift-controller-manager_200c579d-a88e-4a3c-a109-fe9177ce202b_0(a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27): error adding pod openshift-controller-manager_controller-manager-545665d4f8-wjzgw to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27\\\" Netns:\\\"/var/run/netns/a6c4d617-8397-43e6-9cc1-3f183aee126d\\\" IfName:\\\"eth0\\\" 
Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-545665d4f8-wjzgw;K8S_POD_INFRA_CONTAINER_ID=a3d6d12012e47ac84b1c5ab0c836d6a26e8b220a79f7d279b178d18d3c8b6d27;K8S_POD_UID=200c579d-a88e-4a3c-a109-fe9177ce202b\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-545665d4f8-wjzgw] networking: Multus: [openshift-controller-manager/controller-manager-545665d4f8-wjzgw/200c579d-a88e-4a3c-a109-fe9177ce202b]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod controller-manager-545665d4f8-wjzgw in out of cluster comm: pod \\\"controller-manager-545665d4f8-wjzgw\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" podUID="200c579d-a88e-4a3c-a109-fe9177ce202b" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.288855 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.344844 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.348343 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.353921 4795 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.368705 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.429270 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.469342 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.476308 4795 scope.go:117] "RemoveContainer" containerID="20c6ee4ccb6213011b3940b235c040daf85a3677c71b62008ae6c523980228a2" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.603209 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.619896 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.750924 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.835656 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 10 15:11:56 crc kubenswrapper[4795]: I0310 15:11:56.841724 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.153816 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-admission-controller-secret" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.179574 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.186007 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.290558 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.366169 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.377585 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.387295 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.448812 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.460190 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.488178 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.490172 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 
15:11:57.490791 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/1.log" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.490947 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerDied","Data":"ecdb3b054cd8e5afe83c55ba4bc1fb22b061e7b63b5f6cef4770b027484238e2"} Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.490859 4795 generic.go:334] "Generic (PLEG): container finished" podID="9d751cbb-f2e2-430d-9754-c882a5e924a5" containerID="ecdb3b054cd8e5afe83c55ba4bc1fb22b061e7b63b5f6cef4770b027484238e2" exitCode=255 Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.491005 4795 scope.go:117] "RemoveContainer" containerID="20c6ee4ccb6213011b3940b235c040daf85a3677c71b62008ae6c523980228a2" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.491524 4795 scope.go:117] "RemoveContainer" containerID="ecdb3b054cd8e5afe83c55ba4bc1fb22b061e7b63b5f6cef4770b027484238e2" Mar 10 15:11:57 crc kubenswrapper[4795]: E0310 15:11:57.491799 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.499757 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.530117 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.599922 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.611040 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.782794 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.795242 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 10 15:11:57 crc kubenswrapper[4795]: I0310 15:11:57.904750 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.022655 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.052941 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.063973 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.158685 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.161624 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 10 15:11:58 
crc kubenswrapper[4795]: I0310 15:11:58.176348 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.183163 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.253720 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.292756 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.474773 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.495500 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.498004 4795 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.502469 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.536000 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.597912 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.598422 
4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.802865 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.826696 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.847169 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.876773 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.919133 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 10 15:11:58 crc kubenswrapper[4795]: I0310 15:11:58.954415 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.004818 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.073252 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.074941 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.096349 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 
15:11:59.148435 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.157161 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.225405 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.370823 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.401560 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.442574 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.480213 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.491628 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.507421 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.507460 4795 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.589910 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.599574 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.637577 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.722327 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.776725 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.785753 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.789432 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.900701 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.954361 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.989877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.990455 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 10 15:11:59 crc kubenswrapper[4795]: I0310 15:11:59.993954 4795 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.006118 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.017853 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.092900 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.119520 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.178506 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.180257 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.183602 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.329199 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.368024 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.470661 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 
15:12:00.476761 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.542389 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.609402 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.622375 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.728876 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.774004 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.774613 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.782501 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.962732 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 10 15:12:00 crc kubenswrapper[4795]: I0310 15:12:00.979743 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.117150 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 
10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.122860 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.164606 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.206369 4795 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.227972 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.265963 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.372984 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.407185 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.439675 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.465740 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.468597 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.482395 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-operator"/"metrics-tls" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.539905 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.542862 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.708758 4795 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.709008 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb" gracePeriod=5 Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.796830 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.846804 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.888641 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.921452 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 10 15:12:01 crc kubenswrapper[4795]: I0310 15:12:01.937646 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 10 15:12:02 crc kubenswrapper[4795]: I0310 15:12:02.176617 4795 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 10 15:12:02 crc kubenswrapper[4795]: I0310 15:12:02.293731 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 10 15:12:02 crc kubenswrapper[4795]: I0310 15:12:02.484236 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 10 15:12:02 crc kubenswrapper[4795]: I0310 15:12:02.597262 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 10 15:12:02 crc kubenswrapper[4795]: I0310 15:12:02.606028 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 10 15:12:02 crc kubenswrapper[4795]: I0310 15:12:02.623164 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 10 15:12:02 crc kubenswrapper[4795]: I0310 15:12:02.676655 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 10 15:12:02 crc kubenswrapper[4795]: I0310 15:12:02.818263 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 10 15:12:02 crc kubenswrapper[4795]: I0310 15:12:02.902171 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 10 15:12:02 crc kubenswrapper[4795]: I0310 15:12:02.903009 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.006658 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 10 15:12:03 crc 
kubenswrapper[4795]: I0310 15:12:03.106397 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.157340 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.179273 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.346857 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.358057 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.361864 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.529827 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.610146 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.670551 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.781228 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.944698 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 10 15:12:03 crc kubenswrapper[4795]: I0310 15:12:03.978500 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 10 15:12:04 crc kubenswrapper[4795]: I0310 15:12:04.080577 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 10 15:12:04 crc kubenswrapper[4795]: I0310 15:12:04.165936 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 10 15:12:04 crc kubenswrapper[4795]: I0310 15:12:04.426960 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 10 15:12:04 crc kubenswrapper[4795]: I0310 15:12:04.456328 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 10 15:12:04 crc kubenswrapper[4795]: I0310 15:12:04.514809 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 10 15:12:04 crc kubenswrapper[4795]: I0310 15:12:04.678784 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 10 15:12:04 crc kubenswrapper[4795]: I0310 15:12:04.727735 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 10 15:12:04 crc kubenswrapper[4795]: I0310 15:12:04.835740 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 10 15:12:04 crc kubenswrapper[4795]: I0310 15:12:04.858601 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 10 15:12:04 crc kubenswrapper[4795]: I0310 15:12:04.940190 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 10 15:12:04 crc kubenswrapper[4795]: I0310 15:12:04.959877 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 10 15:12:05 crc kubenswrapper[4795]: I0310 15:12:05.664563 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.828275 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.828361 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.891715 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.891831 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.891880 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.891925 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.891954 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.891960 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.891975 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.891996 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.892058 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.892291 4795 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.892332 4795 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.892357 4795 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.892378 4795 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.900308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 10 15:12:06 crc kubenswrapper[4795]: I0310 15:12:06.992679 4795 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.331982 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.459800 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.478520 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.487348 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.490148 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.490626 4795 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.526235 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.526285 4795 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3379f31d-9b57-43fc-abe9-8f9b63d02366"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.529083 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.529161 4795 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3379f31d-9b57-43fc-abe9-8f9b63d02366"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.572797 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.572851 4795 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb" exitCode=137
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.572899 4795 scope.go:117] "RemoveContainer" containerID="06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.572933 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.591464 4795 scope.go:117] "RemoveContainer" containerID="06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb"
Mar 10 15:12:07 crc kubenswrapper[4795]: E0310 15:12:07.593011 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb\": container with ID starting with 06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb not found: ID does not exist" containerID="06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.593051 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb"} err="failed to get container status \"06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb\": rpc error: code = NotFound desc = could not find container \"06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb\": container with ID starting with 06a261e169abbac4b5c25b195fb1beaa9d118d12cb939f33c546ba1d4ed574eb not found: ID does not exist"
Mar 10 15:12:07 crc kubenswrapper[4795]: I0310 15:12:07.962457 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-545665d4f8-wjzgw"]
Mar 10 15:12:08 crc kubenswrapper[4795]: I0310 15:12:08.475750 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv"
Mar 10 15:12:08 crc kubenswrapper[4795]: I0310 15:12:08.476131 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv"
Mar 10 15:12:08 crc kubenswrapper[4795]: I0310 15:12:08.581184 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" event={"ID":"200c579d-a88e-4a3c-a109-fe9177ce202b","Type":"ContainerStarted","Data":"6a2702798c4b0414954602a154b4831f71f951c5adc8dcb95e0e55d869e102aa"}
Mar 10 15:12:08 crc kubenswrapper[4795]: I0310 15:12:08.581407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" event={"ID":"200c579d-a88e-4a3c-a109-fe9177ce202b","Type":"ContainerStarted","Data":"8c1a66aeef152fe2a63ea7615803b847008e4ee921ee63442eb1da437c5c7201"}
Mar 10 15:12:08 crc kubenswrapper[4795]: I0310 15:12:08.581421 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw"
Mar 10 15:12:08 crc kubenswrapper[4795]: I0310 15:12:08.589253 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw"
Mar 10 15:12:08 crc kubenswrapper[4795]: I0310 15:12:08.600054 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-545665d4f8-wjzgw" podStartSLOduration=52.600036669 podStartE2EDuration="52.600036669s" podCreationTimestamp="2026-03-10 15:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:12:08.5997213 +0000 UTC m=+361.765462198" watchObservedRunningTime="2026-03-10 15:12:08.600036669 +0000 UTC m=+361.765777567"
Mar 10 15:12:08 crc kubenswrapper[4795]: I0310 15:12:08.867353 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv"]
Mar 10 15:12:08 crc kubenswrapper[4795]: W0310 15:12:08.872011 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7bf693e_3e78_421e_9b0d_cf526e49cf17.slice/crio-b3bd319346c68ecfa85c62a198be71a1e1bbe7d4dbeca68e6648f63381669573 WatchSource:0}: Error finding container b3bd319346c68ecfa85c62a198be71a1e1bbe7d4dbeca68e6648f63381669573: Status 404 returned error can't find the container with id b3bd319346c68ecfa85c62a198be71a1e1bbe7d4dbeca68e6648f63381669573
Mar 10 15:12:09 crc kubenswrapper[4795]: I0310 15:12:09.590411 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" event={"ID":"b7bf693e-3e78-421e-9b0d-cf526e49cf17","Type":"ContainerStarted","Data":"28ebae8a84e111c5d09a83583f34c840ee9f6e9487971cb139b1df8cf842e5a5"}
Mar 10 15:12:09 crc kubenswrapper[4795]: I0310 15:12:09.590717 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv"
Mar 10 15:12:09 crc kubenswrapper[4795]: I0310 15:12:09.590729 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" event={"ID":"b7bf693e-3e78-421e-9b0d-cf526e49cf17","Type":"ContainerStarted","Data":"b3bd319346c68ecfa85c62a198be71a1e1bbe7d4dbeca68e6648f63381669573"}
Mar 10 15:12:09 crc kubenswrapper[4795]: I0310 15:12:09.596277 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv"
Mar 10 15:12:09 crc kubenswrapper[4795]: I0310 15:12:09.609327 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d5456844b-46pfv" podStartSLOduration=52.609312874 podStartE2EDuration="52.609312874s" podCreationTimestamp="2026-03-10 15:11:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:12:09.606180204 +0000 UTC m=+362.771921102" watchObservedRunningTime="2026-03-10 15:12:09.609312874 +0000 UTC m=+362.775053792"
Mar 10 15:12:10 crc kubenswrapper[4795]: I0310 15:12:10.476563 4795 scope.go:117] "RemoveContainer" containerID="ecdb3b054cd8e5afe83c55ba4bc1fb22b061e7b63b5f6cef4770b027484238e2"
Mar 10 15:12:10 crc kubenswrapper[4795]: E0310 15:12:10.477010 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=check-endpoints pod=network-check-source-55646444c4-trplf_openshift-network-diagnostics(9d751cbb-f2e2-430d-9754-c882a5e924a5)\"" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.653829 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552592-zf6h7"]
Mar 10 15:12:13 crc kubenswrapper[4795]: E0310 15:12:13.654154 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.654169 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.654280 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.654724 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-zf6h7"
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.658198 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.658300 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5"
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.662238 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.677294 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-zf6h7"]
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.782523 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8pjv\" (UniqueName: \"kubernetes.io/projected/0c293ad9-88b5-420b-b70f-6122d9a29d5b-kube-api-access-r8pjv\") pod \"auto-csr-approver-29552592-zf6h7\" (UID: \"0c293ad9-88b5-420b-b70f-6122d9a29d5b\") " pod="openshift-infra/auto-csr-approver-29552592-zf6h7"
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.883552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8pjv\" (UniqueName: \"kubernetes.io/projected/0c293ad9-88b5-420b-b70f-6122d9a29d5b-kube-api-access-r8pjv\") pod \"auto-csr-approver-29552592-zf6h7\" (UID: \"0c293ad9-88b5-420b-b70f-6122d9a29d5b\") " pod="openshift-infra/auto-csr-approver-29552592-zf6h7"
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.900801 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8pjv\" (UniqueName: \"kubernetes.io/projected/0c293ad9-88b5-420b-b70f-6122d9a29d5b-kube-api-access-r8pjv\") pod \"auto-csr-approver-29552592-zf6h7\" (UID: \"0c293ad9-88b5-420b-b70f-6122d9a29d5b\") " pod="openshift-infra/auto-csr-approver-29552592-zf6h7"
Mar 10 15:12:13 crc kubenswrapper[4795]: I0310 15:12:13.972582 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-zf6h7"
Mar 10 15:12:14 crc kubenswrapper[4795]: I0310 15:12:14.456827 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-zf6h7"]
Mar 10 15:12:14 crc kubenswrapper[4795]: I0310 15:12:14.619109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552592-zf6h7" event={"ID":"0c293ad9-88b5-420b-b70f-6122d9a29d5b","Type":"ContainerStarted","Data":"eab25dedd1009eb323d4c3f6eaf702772dea9e716df1f36b63bf16d37c337478"}
Mar 10 15:12:15 crc kubenswrapper[4795]: I0310 15:12:15.628591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552592-zf6h7" event={"ID":"0c293ad9-88b5-420b-b70f-6122d9a29d5b","Type":"ContainerStarted","Data":"eac3adcd0deb9e0f6270e8f2ec94f4e0e58a7dbc91f979986150d1423d88175a"}
Mar 10 15:12:15 crc kubenswrapper[4795]: I0310 15:12:15.646130 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552592-zf6h7" podStartSLOduration=1.742631452 podStartE2EDuration="2.646112795s" podCreationTimestamp="2026-03-10 15:12:13 +0000 UTC" firstStartedPulling="2026-03-10 15:12:14.461245429 +0000 UTC m=+367.626986327" lastFinishedPulling="2026-03-10 15:12:15.364726752 +0000 UTC m=+368.530467670" observedRunningTime="2026-03-10 15:12:15.643925172 +0000 UTC m=+368.809666090" watchObservedRunningTime="2026-03-10 15:12:15.646112795 +0000 UTC m=+368.811853693"
Mar 10 15:12:16 crc kubenswrapper[4795]: I0310 15:12:16.636354 4795 generic.go:334] "Generic (PLEG): container finished" podID="0c293ad9-88b5-420b-b70f-6122d9a29d5b" containerID="eac3adcd0deb9e0f6270e8f2ec94f4e0e58a7dbc91f979986150d1423d88175a" exitCode=0
Mar 10 15:12:16 crc kubenswrapper[4795]: I0310 15:12:16.636400 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552592-zf6h7" event={"ID":"0c293ad9-88b5-420b-b70f-6122d9a29d5b","Type":"ContainerDied","Data":"eac3adcd0deb9e0f6270e8f2ec94f4e0e58a7dbc91f979986150d1423d88175a"}
Mar 10 15:12:17 crc kubenswrapper[4795]: I0310 15:12:17.834794 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 10 15:12:17 crc kubenswrapper[4795]: I0310 15:12:17.992340 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-zf6h7"
Mar 10 15:12:18 crc kubenswrapper[4795]: I0310 15:12:18.137606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8pjv\" (UniqueName: \"kubernetes.io/projected/0c293ad9-88b5-420b-b70f-6122d9a29d5b-kube-api-access-r8pjv\") pod \"0c293ad9-88b5-420b-b70f-6122d9a29d5b\" (UID: \"0c293ad9-88b5-420b-b70f-6122d9a29d5b\") "
Mar 10 15:12:18 crc kubenswrapper[4795]: I0310 15:12:18.145592 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c293ad9-88b5-420b-b70f-6122d9a29d5b-kube-api-access-r8pjv" (OuterVolumeSpecName: "kube-api-access-r8pjv") pod "0c293ad9-88b5-420b-b70f-6122d9a29d5b" (UID: "0c293ad9-88b5-420b-b70f-6122d9a29d5b"). InnerVolumeSpecName "kube-api-access-r8pjv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:12:18 crc kubenswrapper[4795]: I0310 15:12:18.239745 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8pjv\" (UniqueName: \"kubernetes.io/projected/0c293ad9-88b5-420b-b70f-6122d9a29d5b-kube-api-access-r8pjv\") on node \"crc\" DevicePath \"\""
Mar 10 15:12:18 crc kubenswrapper[4795]: I0310 15:12:18.652468 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552592-zf6h7" event={"ID":"0c293ad9-88b5-420b-b70f-6122d9a29d5b","Type":"ContainerDied","Data":"eab25dedd1009eb323d4c3f6eaf702772dea9e716df1f36b63bf16d37c337478"}
Mar 10 15:12:18 crc kubenswrapper[4795]: I0310 15:12:18.652850 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eab25dedd1009eb323d4c3f6eaf702772dea9e716df1f36b63bf16d37c337478"
Mar 10 15:12:18 crc kubenswrapper[4795]: I0310 15:12:18.652528 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552592-zf6h7"
Mar 10 15:12:24 crc kubenswrapper[4795]: I0310 15:12:24.476679 4795 scope.go:117] "RemoveContainer" containerID="ecdb3b054cd8e5afe83c55ba4bc1fb22b061e7b63b5f6cef4770b027484238e2"
Mar 10 15:12:24 crc kubenswrapper[4795]: I0310 15:12:24.698457 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-55646444c4-trplf_9d751cbb-f2e2-430d-9754-c882a5e924a5/check-endpoints/2.log"
Mar 10 15:12:24 crc kubenswrapper[4795]: I0310 15:12:24.698572 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a5380c5a731d5ac73e478586b3f8b7d2845e32305efba4fbbd44c353b4e562b0"}
Mar 10 15:12:24 crc kubenswrapper[4795]: I0310 15:12:24.702027 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 10 15:12:24 crc kubenswrapper[4795]: I0310 15:12:24.702591 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 10 15:12:24 crc kubenswrapper[4795]: I0310 15:12:24.704426 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 10 15:12:24 crc kubenswrapper[4795]: I0310 15:12:24.704478 4795 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c5d0ae0f6d530c2a7ecaf4f52aab67489ac105f8a178670dbfb2a826efe89c93" exitCode=137
Mar 10 15:12:24 crc kubenswrapper[4795]: I0310 15:12:24.704515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c5d0ae0f6d530c2a7ecaf4f52aab67489ac105f8a178670dbfb2a826efe89c93"}
Mar 10 15:12:24 crc kubenswrapper[4795]: I0310 15:12:24.704561 4795 scope.go:117] "RemoveContainer" containerID="40f6ab261bdfdeec095c160a5dd4ac9279f8dfd592fb40f020cc8b3eb2038cd2"
Mar 10 15:12:25 crc kubenswrapper[4795]: I0310 15:12:25.710850 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 10 15:12:25 crc kubenswrapper[4795]: I0310 15:12:25.711701 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 10 15:12:25 crc kubenswrapper[4795]: I0310 15:12:25.712275 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9388c4eb0f12bbd7fff47be1bd04dc3d683b9a80e58af3493cab690cfba45796"}
Mar 10 15:12:27 crc kubenswrapper[4795]: I0310 15:12:27.727571 4795 generic.go:334] "Generic (PLEG): container finished" podID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" containerID="7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b" exitCode=0
Mar 10 15:12:27 crc kubenswrapper[4795]: I0310 15:12:27.727703 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" event={"ID":"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9","Type":"ContainerDied","Data":"7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b"}
Mar 10 15:12:27 crc kubenswrapper[4795]: I0310 15:12:27.728720 4795 scope.go:117] "RemoveContainer" containerID="7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b"
Mar 10 15:12:28 crc kubenswrapper[4795]: I0310 15:12:28.738620 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" event={"ID":"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9","Type":"ContainerStarted","Data":"85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e"}
Mar 10 15:12:28 crc kubenswrapper[4795]: I0310 15:12:28.738982 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59"
Mar 10 15:12:28 crc kubenswrapper[4795]: I0310 15:12:28.741623 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59"
Mar 10 15:12:33 crc kubenswrapper[4795]: I0310 15:12:33.210795 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:12:34 crc kubenswrapper[4795]: I0310 15:12:34.062671 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:12:34 crc kubenswrapper[4795]: I0310 15:12:34.069377 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:12:34 crc kubenswrapper[4795]: I0310 15:12:34.778003 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 10 15:13:18 crc kubenswrapper[4795]: I0310 15:13:18.539482 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:13:18 crc kubenswrapper[4795]: I0310 15:13:18.542006 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.264224 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9h4n2"]
Mar 10 15:13:38 crc kubenswrapper[4795]: E0310 15:13:38.266498 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c293ad9-88b5-420b-b70f-6122d9a29d5b" containerName="oc"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.266657 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c293ad9-88b5-420b-b70f-6122d9a29d5b" containerName="oc"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.266971 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c293ad9-88b5-420b-b70f-6122d9a29d5b" containerName="oc"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.267815 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.287498 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9h4n2"]
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.456288 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.456667 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b264090e-ac7f-4e67-bd24-aeb17314ff37-trusted-ca\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.456717 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b264090e-ac7f-4e67-bd24-aeb17314ff37-registry-certificates\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.456741 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b264090e-ac7f-4e67-bd24-aeb17314ff37-bound-sa-token\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.456769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b264090e-ac7f-4e67-bd24-aeb17314ff37-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.456838 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh7zz\" (UniqueName: \"kubernetes.io/projected/b264090e-ac7f-4e67-bd24-aeb17314ff37-kube-api-access-rh7zz\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.456860 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b264090e-ac7f-4e67-bd24-aeb17314ff37-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.456922 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b264090e-ac7f-4e67-bd24-aeb17314ff37-registry-tls\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.487276 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.557971 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b264090e-ac7f-4e67-bd24-aeb17314ff37-registry-tls\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.558319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b264090e-ac7f-4e67-bd24-aeb17314ff37-trusted-ca\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.558496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b264090e-ac7f-4e67-bd24-aeb17314ff37-registry-certificates\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.558590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b264090e-ac7f-4e67-bd24-aeb17314ff37-bound-sa-token\") pod \"image-registry-66df7c8f76-9h4n2\" (UID:
\"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.558686 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b264090e-ac7f-4e67-bd24-aeb17314ff37-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.558790 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rh7zz\" (UniqueName: \"kubernetes.io/projected/b264090e-ac7f-4e67-bd24-aeb17314ff37-kube-api-access-rh7zz\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.558869 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b264090e-ac7f-4e67-bd24-aeb17314ff37-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.559322 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b264090e-ac7f-4e67-bd24-aeb17314ff37-ca-trust-extracted\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.559966 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/b264090e-ac7f-4e67-bd24-aeb17314ff37-registry-certificates\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.560171 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b264090e-ac7f-4e67-bd24-aeb17314ff37-trusted-ca\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.564099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b264090e-ac7f-4e67-bd24-aeb17314ff37-installation-pull-secrets\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.564949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b264090e-ac7f-4e67-bd24-aeb17314ff37-registry-tls\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.581097 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh7zz\" (UniqueName: \"kubernetes.io/projected/b264090e-ac7f-4e67-bd24-aeb17314ff37-kube-api-access-rh7zz\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.585810 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b264090e-ac7f-4e67-bd24-aeb17314ff37-bound-sa-token\") pod \"image-registry-66df7c8f76-9h4n2\" (UID: \"b264090e-ac7f-4e67-bd24-aeb17314ff37\") " pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:38 crc kubenswrapper[4795]: I0310 15:13:38.883331 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:39 crc kubenswrapper[4795]: I0310 15:13:39.491957 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-9h4n2"] Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.195530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" event={"ID":"b264090e-ac7f-4e67-bd24-aeb17314ff37","Type":"ContainerStarted","Data":"5f7c7522aa567f8d0f323b0de31ebe4926033bad33f17f50c894bfc14efbd066"} Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.196108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" event={"ID":"b264090e-ac7f-4e67-bd24-aeb17314ff37","Type":"ContainerStarted","Data":"34e5b1d5b6aeb22f70c69a06df0715c3bc745b9fa0d2ea5a69bae3738f60889c"} Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.196127 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.215883 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2" podStartSLOduration=2.215866965 podStartE2EDuration="2.215866965s" podCreationTimestamp="2026-03-10 15:13:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:13:40.213084845 +0000 
UTC m=+453.378825743" watchObservedRunningTime="2026-03-10 15:13:40.215866965 +0000 UTC m=+453.381607863" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.402607 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kldkn"] Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.402904 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kldkn" podUID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" containerName="registry-server" containerID="cri-o://d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27" gracePeriod=30 Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.411292 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnfqt"] Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.411533 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nnfqt" podUID="33c07b9b-efc7-4610-85d2-21e44611aa32" containerName="registry-server" containerID="cri-o://a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d" gracePeriod=30 Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.419205 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqw59"] Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.419404 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" podUID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" containerName="marketplace-operator" containerID="cri-o://85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e" gracePeriod=30 Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.427649 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9fcm8"] Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 
15:13:40.427914 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9fcm8" podUID="ae045732-f556-4808-bcd3-114aed4f8414" containerName="registry-server" containerID="cri-o://43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3" gracePeriod=30 Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.432933 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rltg9"] Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.433149 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rltg9" podUID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" containerName="registry-server" containerID="cri-o://782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7" gracePeriod=30 Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.447764 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n48hs"] Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.448376 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.456828 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n48hs"] Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.583755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8nc6\" (UniqueName: \"kubernetes.io/projected/e285d423-5d70-4e87-aed9-13bf768889ec-kube-api-access-v8nc6\") pod \"marketplace-operator-79b997595-n48hs\" (UID: \"e285d423-5d70-4e87-aed9-13bf768889ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.583814 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e285d423-5d70-4e87-aed9-13bf768889ec-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n48hs\" (UID: \"e285d423-5d70-4e87-aed9-13bf768889ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.583886 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e285d423-5d70-4e87-aed9-13bf768889ec-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n48hs\" (UID: \"e285d423-5d70-4e87-aed9-13bf768889ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.685401 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8nc6\" (UniqueName: \"kubernetes.io/projected/e285d423-5d70-4e87-aed9-13bf768889ec-kube-api-access-v8nc6\") pod \"marketplace-operator-79b997595-n48hs\" (UID: 
\"e285d423-5d70-4e87-aed9-13bf768889ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.685450 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e285d423-5d70-4e87-aed9-13bf768889ec-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n48hs\" (UID: \"e285d423-5d70-4e87-aed9-13bf768889ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.685501 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e285d423-5d70-4e87-aed9-13bf768889ec-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n48hs\" (UID: \"e285d423-5d70-4e87-aed9-13bf768889ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.686889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e285d423-5d70-4e87-aed9-13bf768889ec-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n48hs\" (UID: \"e285d423-5d70-4e87-aed9-13bf768889ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.694754 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e285d423-5d70-4e87-aed9-13bf768889ec-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n48hs\" (UID: \"e285d423-5d70-4e87-aed9-13bf768889ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.702683 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-v8nc6\" (UniqueName: \"kubernetes.io/projected/e285d423-5d70-4e87-aed9-13bf768889ec-kube-api-access-v8nc6\") pod \"marketplace-operator-79b997595-n48hs\" (UID: \"e285d423-5d70-4e87-aed9-13bf768889ec\") " pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.761474 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.872996 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.875351 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.908729 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.965410 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9fcm8" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.991146 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-catalog-content\") pod \"33c07b9b-efc7-4610-85d2-21e44611aa32\" (UID: \"33c07b9b-efc7-4610-85d2-21e44611aa32\") " Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.991188 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-operator-metrics\") pod \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.991218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-utilities\") pod \"33c07b9b-efc7-4610-85d2-21e44611aa32\" (UID: \"33c07b9b-efc7-4610-85d2-21e44611aa32\") " Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.991248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cs6d6\" (UniqueName: \"kubernetes.io/projected/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-kube-api-access-cs6d6\") pod \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.991269 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-trusted-ca\") pod \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\" (UID: \"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9\") " Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.991285 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-k45vl\" (UniqueName: \"kubernetes.io/projected/33c07b9b-efc7-4610-85d2-21e44611aa32-kube-api-access-k45vl\") pod \"33c07b9b-efc7-4610-85d2-21e44611aa32\" (UID: \"33c07b9b-efc7-4610-85d2-21e44611aa32\") " Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.992588 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-utilities" (OuterVolumeSpecName: "utilities") pod "33c07b9b-efc7-4610-85d2-21e44611aa32" (UID: "33c07b9b-efc7-4610-85d2-21e44611aa32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.995917 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" (UID: "b06a76ed-56f2-47d3-a6f0-3f1f889e77a9"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.996045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c07b9b-efc7-4610-85d2-21e44611aa32-kube-api-access-k45vl" (OuterVolumeSpecName: "kube-api-access-k45vl") pod "33c07b9b-efc7-4610-85d2-21e44611aa32" (UID: "33c07b9b-efc7-4610-85d2-21e44611aa32"). InnerVolumeSpecName "kube-api-access-k45vl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.997949 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" (UID: "b06a76ed-56f2-47d3-a6f0-3f1f889e77a9"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:13:40 crc kubenswrapper[4795]: I0310 15:13:40.998690 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-kube-api-access-cs6d6" (OuterVolumeSpecName: "kube-api-access-cs6d6") pod "b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" (UID: "b06a76ed-56f2-47d3-a6f0-3f1f889e77a9"). InnerVolumeSpecName "kube-api-access-cs6d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.050990 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33c07b9b-efc7-4610-85d2-21e44611aa32" (UID: "33c07b9b-efc7-4610-85d2-21e44611aa32"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092417 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-catalog-content\") pod \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\" (UID: \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092485 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-utilities\") pod \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\" (UID: \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-catalog-content\") pod \"ae045732-f556-4808-bcd3-114aed4f8414\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xrwm\" (UniqueName: \"kubernetes.io/projected/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-kube-api-access-8xrwm\") pod \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\" (UID: \"9ac91d8b-c0bc-4758-91d3-9fa275d88e02\") " Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092599 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-utilities\") pod \"ae045732-f556-4808-bcd3-114aed4f8414\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092634 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w29bh\" 
(UniqueName: \"kubernetes.io/projected/ae045732-f556-4808-bcd3-114aed4f8414-kube-api-access-w29bh\") pod \"ae045732-f556-4808-bcd3-114aed4f8414\" (UID: \"ae045732-f556-4808-bcd3-114aed4f8414\") " Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092823 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092833 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092844 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33c07b9b-efc7-4610-85d2-21e44611aa32-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092853 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cs6d6\" (UniqueName: \"kubernetes.io/projected/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-kube-api-access-cs6d6\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092861 4795 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.092868 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k45vl\" (UniqueName: \"kubernetes.io/projected/33c07b9b-efc7-4610-85d2-21e44611aa32-kube-api-access-k45vl\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.093669 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-utilities" (OuterVolumeSpecName: "utilities") pod "ae045732-f556-4808-bcd3-114aed4f8414" (UID: "ae045732-f556-4808-bcd3-114aed4f8414"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.094158 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-utilities" (OuterVolumeSpecName: "utilities") pod "9ac91d8b-c0bc-4758-91d3-9fa275d88e02" (UID: "9ac91d8b-c0bc-4758-91d3-9fa275d88e02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.095258 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-kube-api-access-8xrwm" (OuterVolumeSpecName: "kube-api-access-8xrwm") pod "9ac91d8b-c0bc-4758-91d3-9fa275d88e02" (UID: "9ac91d8b-c0bc-4758-91d3-9fa275d88e02"). InnerVolumeSpecName "kube-api-access-8xrwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.095405 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae045732-f556-4808-bcd3-114aed4f8414-kube-api-access-w29bh" (OuterVolumeSpecName: "kube-api-access-w29bh") pod "ae045732-f556-4808-bcd3-114aed4f8414" (UID: "ae045732-f556-4808-bcd3-114aed4f8414"). InnerVolumeSpecName "kube-api-access-w29bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.121837 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae045732-f556-4808-bcd3-114aed4f8414" (UID: "ae045732-f556-4808-bcd3-114aed4f8414"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.164594 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ac91d8b-c0bc-4758-91d3-9fa275d88e02" (UID: "9ac91d8b-c0bc-4758-91d3-9fa275d88e02"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.193618 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w29bh\" (UniqueName: \"kubernetes.io/projected/ae045732-f556-4808-bcd3-114aed4f8414-kube-api-access-w29bh\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.193648 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.193657 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.193665 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.193673 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xrwm\" (UniqueName: \"kubernetes.io/projected/9ac91d8b-c0bc-4758-91d3-9fa275d88e02-kube-api-access-8xrwm\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.193682 4795 reconciler_common.go:293] 
"Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae045732-f556-4808-bcd3-114aed4f8414-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.201447 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n48hs"] Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.202842 4795 generic.go:334] "Generic (PLEG): container finished" podID="ae045732-f556-4808-bcd3-114aed4f8414" containerID="43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3" exitCode=0 Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.202882 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9fcm8" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.202909 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fcm8" event={"ID":"ae045732-f556-4808-bcd3-114aed4f8414","Type":"ContainerDied","Data":"43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3"} Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.202940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fcm8" event={"ID":"ae045732-f556-4808-bcd3-114aed4f8414","Type":"ContainerDied","Data":"d5acd2b5f7207fb31aac14d3fa256298b93479ff0d6c54300d834dc404a54d58"} Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.202954 4795 scope.go:117] "RemoveContainer" containerID="43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.205427 4795 generic.go:334] "Generic (PLEG): container finished" podID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" containerID="d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27" exitCode=0 Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.205469 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-kldkn" event={"ID":"9ac91d8b-c0bc-4758-91d3-9fa275d88e02","Type":"ContainerDied","Data":"d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27"} Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.205488 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kldkn" event={"ID":"9ac91d8b-c0bc-4758-91d3-9fa275d88e02","Type":"ContainerDied","Data":"7ae2c6e228797d6b9f29ed0a401d6c6005d2852220fe01214cd195fbc946466c"} Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.205530 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kldkn" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.211878 4795 generic.go:334] "Generic (PLEG): container finished" podID="33c07b9b-efc7-4610-85d2-21e44611aa32" containerID="a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d" exitCode=0 Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.211995 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nnfqt" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.212127 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnfqt" event={"ID":"33c07b9b-efc7-4610-85d2-21e44611aa32","Type":"ContainerDied","Data":"a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d"} Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.212161 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nnfqt" event={"ID":"33c07b9b-efc7-4610-85d2-21e44611aa32","Type":"ContainerDied","Data":"8a0171baa8e5a64e083a88d4b6997b41dab73898173d2e953b6e5ca065963d60"} Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.216741 4795 generic.go:334] "Generic (PLEG): container finished" podID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" containerID="85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e" exitCode=0 Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.216778 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.216804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" event={"ID":"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9","Type":"ContainerDied","Data":"85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e"} Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.216839 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sqw59" event={"ID":"b06a76ed-56f2-47d3-a6f0-3f1f889e77a9","Type":"ContainerDied","Data":"9343ea30c764d2147fedb9d0ef2907d3e46be8d61997c70e23524d6b862821a3"} Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.220243 4795 scope.go:117] "RemoveContainer" containerID="a555d6946f5b6d247fa18f8718a1ba808410750b721dfd9534531444557f769d" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.231017 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-kldkn"] Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.240641 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kldkn"] Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.252862 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqw59"] Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.260742 4795 scope.go:117] "RemoveContainer" containerID="6652012da9af4ce7d3b607bd0626aeefc26184cb78b499d99aff5905554d01c4" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.264831 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sqw59"] Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.268849 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nnfqt"] 
Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.273083 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nnfqt"] Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.287734 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9fcm8"] Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.288280 4795 scope.go:117] "RemoveContainer" containerID="43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.289500 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9fcm8"] Mar 10 15:13:41 crc kubenswrapper[4795]: E0310 15:13:41.290525 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3\": container with ID starting with 43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3 not found: ID does not exist" containerID="43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.290564 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3"} err="failed to get container status \"43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3\": rpc error: code = NotFound desc = could not find container \"43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3\": container with ID starting with 43435ba62934de950a2820da54d8a0979520c29cfd9e450c2cdb9e7266026cf3 not found: ID does not exist" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.290595 4795 scope.go:117] "RemoveContainer" containerID="a555d6946f5b6d247fa18f8718a1ba808410750b721dfd9534531444557f769d" Mar 10 15:13:41 crc kubenswrapper[4795]: E0310 15:13:41.294365 
4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a555d6946f5b6d247fa18f8718a1ba808410750b721dfd9534531444557f769d\": container with ID starting with a555d6946f5b6d247fa18f8718a1ba808410750b721dfd9534531444557f769d not found: ID does not exist" containerID="a555d6946f5b6d247fa18f8718a1ba808410750b721dfd9534531444557f769d" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.294419 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a555d6946f5b6d247fa18f8718a1ba808410750b721dfd9534531444557f769d"} err="failed to get container status \"a555d6946f5b6d247fa18f8718a1ba808410750b721dfd9534531444557f769d\": rpc error: code = NotFound desc = could not find container \"a555d6946f5b6d247fa18f8718a1ba808410750b721dfd9534531444557f769d\": container with ID starting with a555d6946f5b6d247fa18f8718a1ba808410750b721dfd9534531444557f769d not found: ID does not exist" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.294443 4795 scope.go:117] "RemoveContainer" containerID="6652012da9af4ce7d3b607bd0626aeefc26184cb78b499d99aff5905554d01c4" Mar 10 15:13:41 crc kubenswrapper[4795]: E0310 15:13:41.294702 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6652012da9af4ce7d3b607bd0626aeefc26184cb78b499d99aff5905554d01c4\": container with ID starting with 6652012da9af4ce7d3b607bd0626aeefc26184cb78b499d99aff5905554d01c4 not found: ID does not exist" containerID="6652012da9af4ce7d3b607bd0626aeefc26184cb78b499d99aff5905554d01c4" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.294735 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6652012da9af4ce7d3b607bd0626aeefc26184cb78b499d99aff5905554d01c4"} err="failed to get container status \"6652012da9af4ce7d3b607bd0626aeefc26184cb78b499d99aff5905554d01c4\": rpc error: code = 
NotFound desc = could not find container \"6652012da9af4ce7d3b607bd0626aeefc26184cb78b499d99aff5905554d01c4\": container with ID starting with 6652012da9af4ce7d3b607bd0626aeefc26184cb78b499d99aff5905554d01c4 not found: ID does not exist" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.294758 4795 scope.go:117] "RemoveContainer" containerID="d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.306422 4795 scope.go:117] "RemoveContainer" containerID="b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.328934 4795 scope.go:117] "RemoveContainer" containerID="a86ad1d2421dd3da36b143fe595816368e200d0090cd073ac6ceb4bb2020bcbf" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.349799 4795 scope.go:117] "RemoveContainer" containerID="d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27" Mar 10 15:13:41 crc kubenswrapper[4795]: E0310 15:13:41.350317 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27\": container with ID starting with d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27 not found: ID does not exist" containerID="d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.350342 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27"} err="failed to get container status \"d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27\": rpc error: code = NotFound desc = could not find container \"d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27\": container with ID starting with d9df85c09450cf0ebc0649b2cfbae632843d6646c7ca1603cc736fb0bee7af27 not found: ID 
does not exist" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.350362 4795 scope.go:117] "RemoveContainer" containerID="b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2" Mar 10 15:13:41 crc kubenswrapper[4795]: E0310 15:13:41.350739 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2\": container with ID starting with b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2 not found: ID does not exist" containerID="b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.350780 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2"} err="failed to get container status \"b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2\": rpc error: code = NotFound desc = could not find container \"b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2\": container with ID starting with b3fc19ddfa6bd257e979f2764997e128a297323676ac927007c355c1c1f635d2 not found: ID does not exist" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.350831 4795 scope.go:117] "RemoveContainer" containerID="a86ad1d2421dd3da36b143fe595816368e200d0090cd073ac6ceb4bb2020bcbf" Mar 10 15:13:41 crc kubenswrapper[4795]: E0310 15:13:41.351244 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a86ad1d2421dd3da36b143fe595816368e200d0090cd073ac6ceb4bb2020bcbf\": container with ID starting with a86ad1d2421dd3da36b143fe595816368e200d0090cd073ac6ceb4bb2020bcbf not found: ID does not exist" containerID="a86ad1d2421dd3da36b143fe595816368e200d0090cd073ac6ceb4bb2020bcbf" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.351268 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a86ad1d2421dd3da36b143fe595816368e200d0090cd073ac6ceb4bb2020bcbf"} err="failed to get container status \"a86ad1d2421dd3da36b143fe595816368e200d0090cd073ac6ceb4bb2020bcbf\": rpc error: code = NotFound desc = could not find container \"a86ad1d2421dd3da36b143fe595816368e200d0090cd073ac6ceb4bb2020bcbf\": container with ID starting with a86ad1d2421dd3da36b143fe595816368e200d0090cd073ac6ceb4bb2020bcbf not found: ID does not exist" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.351282 4795 scope.go:117] "RemoveContainer" containerID="a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.423462 4795 scope.go:117] "RemoveContainer" containerID="16356812bcc89ba6a70f743a161dd3229b50d94957d6be6b8767c78dfca5daee" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.438802 4795 scope.go:117] "RemoveContainer" containerID="bdf631e29302b11d19d668c30b5e2c31e5d2d2aa238087c7c8dbd4a895fd0b61" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.450560 4795 scope.go:117] "RemoveContainer" containerID="a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d" Mar 10 15:13:41 crc kubenswrapper[4795]: E0310 15:13:41.450907 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d\": container with ID starting with a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d not found: ID does not exist" containerID="a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.450940 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d"} err="failed to get container status 
\"a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d\": rpc error: code = NotFound desc = could not find container \"a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d\": container with ID starting with a062cef9480ad734c2de06adcee59bf53ed329cd443c4107ac6226ec1511fc1d not found: ID does not exist" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.450959 4795 scope.go:117] "RemoveContainer" containerID="16356812bcc89ba6a70f743a161dd3229b50d94957d6be6b8767c78dfca5daee" Mar 10 15:13:41 crc kubenswrapper[4795]: E0310 15:13:41.451408 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16356812bcc89ba6a70f743a161dd3229b50d94957d6be6b8767c78dfca5daee\": container with ID starting with 16356812bcc89ba6a70f743a161dd3229b50d94957d6be6b8767c78dfca5daee not found: ID does not exist" containerID="16356812bcc89ba6a70f743a161dd3229b50d94957d6be6b8767c78dfca5daee" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.451432 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16356812bcc89ba6a70f743a161dd3229b50d94957d6be6b8767c78dfca5daee"} err="failed to get container status \"16356812bcc89ba6a70f743a161dd3229b50d94957d6be6b8767c78dfca5daee\": rpc error: code = NotFound desc = could not find container \"16356812bcc89ba6a70f743a161dd3229b50d94957d6be6b8767c78dfca5daee\": container with ID starting with 16356812bcc89ba6a70f743a161dd3229b50d94957d6be6b8767c78dfca5daee not found: ID does not exist" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.451449 4795 scope.go:117] "RemoveContainer" containerID="bdf631e29302b11d19d668c30b5e2c31e5d2d2aa238087c7c8dbd4a895fd0b61" Mar 10 15:13:41 crc kubenswrapper[4795]: E0310 15:13:41.451794 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bdf631e29302b11d19d668c30b5e2c31e5d2d2aa238087c7c8dbd4a895fd0b61\": container with ID starting with bdf631e29302b11d19d668c30b5e2c31e5d2d2aa238087c7c8dbd4a895fd0b61 not found: ID does not exist" containerID="bdf631e29302b11d19d668c30b5e2c31e5d2d2aa238087c7c8dbd4a895fd0b61" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.451818 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf631e29302b11d19d668c30b5e2c31e5d2d2aa238087c7c8dbd4a895fd0b61"} err="failed to get container status \"bdf631e29302b11d19d668c30b5e2c31e5d2d2aa238087c7c8dbd4a895fd0b61\": rpc error: code = NotFound desc = could not find container \"bdf631e29302b11d19d668c30b5e2c31e5d2d2aa238087c7c8dbd4a895fd0b61\": container with ID starting with bdf631e29302b11d19d668c30b5e2c31e5d2d2aa238087c7c8dbd4a895fd0b61 not found: ID does not exist" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.451833 4795 scope.go:117] "RemoveContainer" containerID="85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.463949 4795 scope.go:117] "RemoveContainer" containerID="7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.479637 4795 scope.go:117] "RemoveContainer" containerID="85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e" Mar 10 15:13:41 crc kubenswrapper[4795]: E0310 15:13:41.479998 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e\": container with ID starting with 85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e not found: ID does not exist" containerID="85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.480038 4795 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e"} err="failed to get container status \"85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e\": rpc error: code = NotFound desc = could not find container \"85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e\": container with ID starting with 85539d5cea8abcd758ba271417b923af3227642b8a6e2fce1c5a7ff4774a738e not found: ID does not exist" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.480093 4795 scope.go:117] "RemoveContainer" containerID="7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b" Mar 10 15:13:41 crc kubenswrapper[4795]: E0310 15:13:41.480385 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b\": container with ID starting with 7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b not found: ID does not exist" containerID="7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.480415 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b"} err="failed to get container status \"7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b\": rpc error: code = NotFound desc = could not find container \"7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b\": container with ID starting with 7ebb8e4be0cba14a9eea8679b890abd0ba4f1ee8afef2b9ab5ec8328e2ed646b not found: ID does not exist" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.483583 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33c07b9b-efc7-4610-85d2-21e44611aa32" path="/var/lib/kubelet/pods/33c07b9b-efc7-4610-85d2-21e44611aa32/volumes" Mar 10 15:13:41 crc kubenswrapper[4795]: 
I0310 15:13:41.484644 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" path="/var/lib/kubelet/pods/9ac91d8b-c0bc-4758-91d3-9fa275d88e02/volumes" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.485370 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae045732-f556-4808-bcd3-114aed4f8414" path="/var/lib/kubelet/pods/ae045732-f556-4808-bcd3-114aed4f8414/volumes" Mar 10 15:13:41 crc kubenswrapper[4795]: I0310 15:13:41.486681 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" path="/var/lib/kubelet/pods/b06a76ed-56f2-47d3-a6f0-3f1f889e77a9/volumes" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.133525 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.225550 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" event={"ID":"e285d423-5d70-4e87-aed9-13bf768889ec","Type":"ContainerStarted","Data":"2d8d107fe0de80c781be1c2a80e2eb0e2f9069d04e70ec4fa443337eeeafb4f5"} Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.225609 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" event={"ID":"e285d423-5d70-4e87-aed9-13bf768889ec","Type":"ContainerStarted","Data":"e591a036a221450ef83708fcceb26e09e4a19257f1ff185cdf752059a9907f8a"} Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.227100 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.229424 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" Mar 10 15:13:42 crc 
kubenswrapper[4795]: I0310 15:13:42.237962 4795 generic.go:334] "Generic (PLEG): container finished" podID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" containerID="782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7" exitCode=0 Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.237999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rltg9" event={"ID":"dbf76fb0-f29e-4d22-ab17-d57b93755cc6","Type":"ContainerDied","Data":"782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7"} Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.238007 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rltg9" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.238032 4795 scope.go:117] "RemoveContainer" containerID="782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.238020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rltg9" event={"ID":"dbf76fb0-f29e-4d22-ab17-d57b93755cc6","Type":"ContainerDied","Data":"c3bcbd32d74bd51adb26c21cbdcf0562c0521721b518f67232b5af84c5a92fd8"} Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.263498 4795 scope.go:117] "RemoveContainer" containerID="8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.267447 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n48hs" podStartSLOduration=2.26742803 podStartE2EDuration="2.26742803s" podCreationTimestamp="2026-03-10 15:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:13:42.252423769 +0000 UTC m=+455.418164667" watchObservedRunningTime="2026-03-10 15:13:42.26742803 +0000 UTC 
m=+455.433168928" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.306428 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-catalog-content\") pod \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\" (UID: \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.306503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmqhh\" (UniqueName: \"kubernetes.io/projected/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-kube-api-access-dmqhh\") pod \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\" (UID: \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.306552 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-utilities\") pod \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\" (UID: \"dbf76fb0-f29e-4d22-ab17-d57b93755cc6\") " Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.312832 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-utilities" (OuterVolumeSpecName: "utilities") pod "dbf76fb0-f29e-4d22-ab17-d57b93755cc6" (UID: "dbf76fb0-f29e-4d22-ab17-d57b93755cc6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.319232 4795 scope.go:117] "RemoveContainer" containerID="547c5d64a5064f0b50d7cc216b195e7d02f1440815b57ac4804d3ed5db5c3743" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.343370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-kube-api-access-dmqhh" (OuterVolumeSpecName: "kube-api-access-dmqhh") pod "dbf76fb0-f29e-4d22-ab17-d57b93755cc6" (UID: "dbf76fb0-f29e-4d22-ab17-d57b93755cc6"). InnerVolumeSpecName "kube-api-access-dmqhh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.368169 4795 scope.go:117] "RemoveContainer" containerID="782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.368544 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7\": container with ID starting with 782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7 not found: ID does not exist" containerID="782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.368588 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7"} err="failed to get container status \"782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7\": rpc error: code = NotFound desc = could not find container \"782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7\": container with ID starting with 782641471b03446f356008f42d0086eeb4c216b2f87dd2bd5b77086ec56d2aa7 not found: ID does not exist" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.368635 
4795 scope.go:117] "RemoveContainer" containerID="8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.368897 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864\": container with ID starting with 8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864 not found: ID does not exist" containerID="8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.368918 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864"} err="failed to get container status \"8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864\": rpc error: code = NotFound desc = could not find container \"8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864\": container with ID starting with 8e281d5d55ab381dbe4535456fb75c98a0a6d252ec64a0e19e926f500dcae864 not found: ID does not exist" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.368935 4795 scope.go:117] "RemoveContainer" containerID="547c5d64a5064f0b50d7cc216b195e7d02f1440815b57ac4804d3ed5db5c3743" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.369159 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547c5d64a5064f0b50d7cc216b195e7d02f1440815b57ac4804d3ed5db5c3743\": container with ID starting with 547c5d64a5064f0b50d7cc216b195e7d02f1440815b57ac4804d3ed5db5c3743 not found: ID does not exist" containerID="547c5d64a5064f0b50d7cc216b195e7d02f1440815b57ac4804d3ed5db5c3743" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.369177 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"547c5d64a5064f0b50d7cc216b195e7d02f1440815b57ac4804d3ed5db5c3743"} err="failed to get container status \"547c5d64a5064f0b50d7cc216b195e7d02f1440815b57ac4804d3ed5db5c3743\": rpc error: code = NotFound desc = could not find container \"547c5d64a5064f0b50d7cc216b195e7d02f1440815b57ac4804d3ed5db5c3743\": container with ID starting with 547c5d64a5064f0b50d7cc216b195e7d02f1440815b57ac4804d3ed5db5c3743 not found: ID does not exist" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.408059 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmqhh\" (UniqueName: \"kubernetes.io/projected/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-kube-api-access-dmqhh\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.408102 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.473826 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbf76fb0-f29e-4d22-ab17-d57b93755cc6" (UID: "dbf76fb0-f29e-4d22-ab17-d57b93755cc6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.509114 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbf76fb0-f29e-4d22-ab17-d57b93755cc6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.570230 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rltg9"] Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.575917 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rltg9"] Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.622772 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pj98w"] Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623024 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae045732-f556-4808-bcd3-114aed4f8414" containerName="extract-utilities" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623038 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae045732-f556-4808-bcd3-114aed4f8414" containerName="extract-utilities" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623048 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623054 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623085 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" containerName="extract-content" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623093 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" 
containerName="extract-content" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623104 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" containerName="extract-content" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623110 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" containerName="extract-content" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623119 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c07b9b-efc7-4610-85d2-21e44611aa32" containerName="extract-content" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623124 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c07b9b-efc7-4610-85d2-21e44611aa32" containerName="extract-content" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623130 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae045732-f556-4808-bcd3-114aed4f8414" containerName="extract-content" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623136 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae045732-f556-4808-bcd3-114aed4f8414" containerName="extract-content" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623143 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623148 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623155 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" containerName="extract-utilities" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623161 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" 
containerName="extract-utilities" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623171 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" containerName="marketplace-operator" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623177 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" containerName="marketplace-operator" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623184 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae045732-f556-4808-bcd3-114aed4f8414" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623190 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae045732-f556-4808-bcd3-114aed4f8414" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623198 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c07b9b-efc7-4610-85d2-21e44611aa32" containerName="extract-utilities" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623203 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c07b9b-efc7-4610-85d2-21e44611aa32" containerName="extract-utilities" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623212 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" containerName="extract-utilities" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623218 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" containerName="extract-utilities" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623228 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c07b9b-efc7-4610-85d2-21e44611aa32" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623233 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="33c07b9b-efc7-4610-85d2-21e44611aa32" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623347 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c07b9b-efc7-4610-85d2-21e44611aa32" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623390 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" containerName="marketplace-operator" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623400 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae045732-f556-4808-bcd3-114aed4f8414" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623409 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" containerName="marketplace-operator" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623415 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac91d8b-c0bc-4758-91d3-9fa275d88e02" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623423 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" containerName="registry-server" Mar 10 15:13:42 crc kubenswrapper[4795]: E0310 15:13:42.623592 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" containerName="marketplace-operator" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.623600 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06a76ed-56f2-47d3-a6f0-3f1f889e77a9" containerName="marketplace-operator" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.624403 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pj98w" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.628851 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.632095 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj98w"] Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.711229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dcqx\" (UniqueName: \"kubernetes.io/projected/fb933310-8cd4-41f5-8a2e-2956f51956e1-kube-api-access-6dcqx\") pod \"redhat-marketplace-pj98w\" (UID: \"fb933310-8cd4-41f5-8a2e-2956f51956e1\") " pod="openshift-marketplace/redhat-marketplace-pj98w" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.711278 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb933310-8cd4-41f5-8a2e-2956f51956e1-utilities\") pod \"redhat-marketplace-pj98w\" (UID: \"fb933310-8cd4-41f5-8a2e-2956f51956e1\") " pod="openshift-marketplace/redhat-marketplace-pj98w" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.711310 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb933310-8cd4-41f5-8a2e-2956f51956e1-catalog-content\") pod \"redhat-marketplace-pj98w\" (UID: \"fb933310-8cd4-41f5-8a2e-2956f51956e1\") " pod="openshift-marketplace/redhat-marketplace-pj98w" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.812739 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dcqx\" (UniqueName: \"kubernetes.io/projected/fb933310-8cd4-41f5-8a2e-2956f51956e1-kube-api-access-6dcqx\") pod \"redhat-marketplace-pj98w\" (UID: 
\"fb933310-8cd4-41f5-8a2e-2956f51956e1\") " pod="openshift-marketplace/redhat-marketplace-pj98w" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.812815 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb933310-8cd4-41f5-8a2e-2956f51956e1-utilities\") pod \"redhat-marketplace-pj98w\" (UID: \"fb933310-8cd4-41f5-8a2e-2956f51956e1\") " pod="openshift-marketplace/redhat-marketplace-pj98w" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.812865 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb933310-8cd4-41f5-8a2e-2956f51956e1-catalog-content\") pod \"redhat-marketplace-pj98w\" (UID: \"fb933310-8cd4-41f5-8a2e-2956f51956e1\") " pod="openshift-marketplace/redhat-marketplace-pj98w" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.813683 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb933310-8cd4-41f5-8a2e-2956f51956e1-catalog-content\") pod \"redhat-marketplace-pj98w\" (UID: \"fb933310-8cd4-41f5-8a2e-2956f51956e1\") " pod="openshift-marketplace/redhat-marketplace-pj98w" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.813962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb933310-8cd4-41f5-8a2e-2956f51956e1-utilities\") pod \"redhat-marketplace-pj98w\" (UID: \"fb933310-8cd4-41f5-8a2e-2956f51956e1\") " pod="openshift-marketplace/redhat-marketplace-pj98w" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.832124 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dcqx\" (UniqueName: \"kubernetes.io/projected/fb933310-8cd4-41f5-8a2e-2956f51956e1-kube-api-access-6dcqx\") pod \"redhat-marketplace-pj98w\" (UID: \"fb933310-8cd4-41f5-8a2e-2956f51956e1\") " 
pod="openshift-marketplace/redhat-marketplace-pj98w" Mar 10 15:13:42 crc kubenswrapper[4795]: I0310 15:13:42.941301 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pj98w" Mar 10 15:13:43 crc kubenswrapper[4795]: I0310 15:13:43.182120 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pj98w"] Mar 10 15:13:43 crc kubenswrapper[4795]: I0310 15:13:43.249859 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj98w" event={"ID":"fb933310-8cd4-41f5-8a2e-2956f51956e1","Type":"ContainerStarted","Data":"748798cf4bf19ef8c932e3a736e2cf8527f7e90d07d4d4a913ba55d9c0db3b66"} Mar 10 15:13:43 crc kubenswrapper[4795]: I0310 15:13:43.482594 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf76fb0-f29e-4d22-ab17-d57b93755cc6" path="/var/lib/kubelet/pods/dbf76fb0-f29e-4d22-ab17-d57b93755cc6/volumes" Mar 10 15:13:44 crc kubenswrapper[4795]: I0310 15:13:44.257396 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb933310-8cd4-41f5-8a2e-2956f51956e1" containerID="c6b93264da9714c4c81ecc7e49b0f4f4c033e34b7ae4785f98241e040fc182e5" exitCode=0 Mar 10 15:13:44 crc kubenswrapper[4795]: I0310 15:13:44.257473 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj98w" event={"ID":"fb933310-8cd4-41f5-8a2e-2956f51956e1","Type":"ContainerDied","Data":"c6b93264da9714c4c81ecc7e49b0f4f4c033e34b7ae4785f98241e040fc182e5"} Mar 10 15:13:44 crc kubenswrapper[4795]: I0310 15:13:44.830882 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9zdl2"] Mar 10 15:13:44 crc kubenswrapper[4795]: I0310 15:13:44.832062 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9zdl2" Mar 10 15:13:44 crc kubenswrapper[4795]: I0310 15:13:44.835852 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 10 15:13:44 crc kubenswrapper[4795]: I0310 15:13:44.858692 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zdl2"] Mar 10 15:13:44 crc kubenswrapper[4795]: I0310 15:13:44.962712 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c92a54b-5890-4e03-8c5a-c02f308fa42c-catalog-content\") pod \"certified-operators-9zdl2\" (UID: \"1c92a54b-5890-4e03-8c5a-c02f308fa42c\") " pod="openshift-marketplace/certified-operators-9zdl2" Mar 10 15:13:44 crc kubenswrapper[4795]: I0310 15:13:44.963183 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxvrj\" (UniqueName: \"kubernetes.io/projected/1c92a54b-5890-4e03-8c5a-c02f308fa42c-kube-api-access-kxvrj\") pod \"certified-operators-9zdl2\" (UID: \"1c92a54b-5890-4e03-8c5a-c02f308fa42c\") " pod="openshift-marketplace/certified-operators-9zdl2" Mar 10 15:13:44 crc kubenswrapper[4795]: I0310 15:13:44.963703 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c92a54b-5890-4e03-8c5a-c02f308fa42c-utilities\") pod \"certified-operators-9zdl2\" (UID: \"1c92a54b-5890-4e03-8c5a-c02f308fa42c\") " pod="openshift-marketplace/certified-operators-9zdl2" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.037718 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qxkhv"] Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.039884 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qxkhv" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.042931 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.050648 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxkhv"] Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.064968 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c92a54b-5890-4e03-8c5a-c02f308fa42c-catalog-content\") pod \"certified-operators-9zdl2\" (UID: \"1c92a54b-5890-4e03-8c5a-c02f308fa42c\") " pod="openshift-marketplace/certified-operators-9zdl2" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.065072 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxvrj\" (UniqueName: \"kubernetes.io/projected/1c92a54b-5890-4e03-8c5a-c02f308fa42c-kube-api-access-kxvrj\") pod \"certified-operators-9zdl2\" (UID: \"1c92a54b-5890-4e03-8c5a-c02f308fa42c\") " pod="openshift-marketplace/certified-operators-9zdl2" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.065133 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c92a54b-5890-4e03-8c5a-c02f308fa42c-utilities\") pod \"certified-operators-9zdl2\" (UID: \"1c92a54b-5890-4e03-8c5a-c02f308fa42c\") " pod="openshift-marketplace/certified-operators-9zdl2" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.065633 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c92a54b-5890-4e03-8c5a-c02f308fa42c-utilities\") pod \"certified-operators-9zdl2\" (UID: \"1c92a54b-5890-4e03-8c5a-c02f308fa42c\") " 
pod="openshift-marketplace/certified-operators-9zdl2" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.066550 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c92a54b-5890-4e03-8c5a-c02f308fa42c-catalog-content\") pod \"certified-operators-9zdl2\" (UID: \"1c92a54b-5890-4e03-8c5a-c02f308fa42c\") " pod="openshift-marketplace/certified-operators-9zdl2" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.088868 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxvrj\" (UniqueName: \"kubernetes.io/projected/1c92a54b-5890-4e03-8c5a-c02f308fa42c-kube-api-access-kxvrj\") pod \"certified-operators-9zdl2\" (UID: \"1c92a54b-5890-4e03-8c5a-c02f308fa42c\") " pod="openshift-marketplace/certified-operators-9zdl2" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.166297 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2290378c-c594-45e6-9436-0bff75e32af9-catalog-content\") pod \"community-operators-qxkhv\" (UID: \"2290378c-c594-45e6-9436-0bff75e32af9\") " pod="openshift-marketplace/community-operators-qxkhv" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.166345 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspw6\" (UniqueName: \"kubernetes.io/projected/2290378c-c594-45e6-9436-0bff75e32af9-kube-api-access-vspw6\") pod \"community-operators-qxkhv\" (UID: \"2290378c-c594-45e6-9436-0bff75e32af9\") " pod="openshift-marketplace/community-operators-qxkhv" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.166384 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2290378c-c594-45e6-9436-0bff75e32af9-utilities\") pod \"community-operators-qxkhv\" (UID: 
\"2290378c-c594-45e6-9436-0bff75e32af9\") " pod="openshift-marketplace/community-operators-qxkhv" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.177313 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9zdl2" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.267584 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2290378c-c594-45e6-9436-0bff75e32af9-catalog-content\") pod \"community-operators-qxkhv\" (UID: \"2290378c-c594-45e6-9436-0bff75e32af9\") " pod="openshift-marketplace/community-operators-qxkhv" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.267861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vspw6\" (UniqueName: \"kubernetes.io/projected/2290378c-c594-45e6-9436-0bff75e32af9-kube-api-access-vspw6\") pod \"community-operators-qxkhv\" (UID: \"2290378c-c594-45e6-9436-0bff75e32af9\") " pod="openshift-marketplace/community-operators-qxkhv" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.267903 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2290378c-c594-45e6-9436-0bff75e32af9-utilities\") pod \"community-operators-qxkhv\" (UID: \"2290378c-c594-45e6-9436-0bff75e32af9\") " pod="openshift-marketplace/community-operators-qxkhv" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.268354 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2290378c-c594-45e6-9436-0bff75e32af9-catalog-content\") pod \"community-operators-qxkhv\" (UID: \"2290378c-c594-45e6-9436-0bff75e32af9\") " pod="openshift-marketplace/community-operators-qxkhv" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.268414 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2290378c-c594-45e6-9436-0bff75e32af9-utilities\") pod \"community-operators-qxkhv\" (UID: \"2290378c-c594-45e6-9436-0bff75e32af9\") " pod="openshift-marketplace/community-operators-qxkhv" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.270411 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj98w" event={"ID":"fb933310-8cd4-41f5-8a2e-2956f51956e1","Type":"ContainerStarted","Data":"7b7a04114c003ddc4c1243c8aa4ea6e8715142a90f2c29c7bfa415b45abd6868"} Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.290778 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspw6\" (UniqueName: \"kubernetes.io/projected/2290378c-c594-45e6-9436-0bff75e32af9-kube-api-access-vspw6\") pod \"community-operators-qxkhv\" (UID: \"2290378c-c594-45e6-9436-0bff75e32af9\") " pod="openshift-marketplace/community-operators-qxkhv" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.380284 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qxkhv" Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.646295 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9zdl2"] Mar 10 15:13:45 crc kubenswrapper[4795]: W0310 15:13:45.656198 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c92a54b_5890_4e03_8c5a_c02f308fa42c.slice/crio-3bf64768e39a8149c4c3d8cd5425e5f4e5ebf4c1ec5d5938bde9b1ecd5868c08 WatchSource:0}: Error finding container 3bf64768e39a8149c4c3d8cd5425e5f4e5ebf4c1ec5d5938bde9b1ecd5868c08: Status 404 returned error can't find the container with id 3bf64768e39a8149c4c3d8cd5425e5f4e5ebf4c1ec5d5938bde9b1ecd5868c08 Mar 10 15:13:45 crc kubenswrapper[4795]: I0310 15:13:45.771987 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qxkhv"] Mar 10 15:13:45 crc kubenswrapper[4795]: W0310 15:13:45.779280 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2290378c_c594_45e6_9436_0bff75e32af9.slice/crio-1348df1ec2a17bfa16fc877e30be86850b4aaa79972d238751156609eb2e9474 WatchSource:0}: Error finding container 1348df1ec2a17bfa16fc877e30be86850b4aaa79972d238751156609eb2e9474: Status 404 returned error can't find the container with id 1348df1ec2a17bfa16fc877e30be86850b4aaa79972d238751156609eb2e9474 Mar 10 15:13:46 crc kubenswrapper[4795]: I0310 15:13:46.279012 4795 generic.go:334] "Generic (PLEG): container finished" podID="2290378c-c594-45e6-9436-0bff75e32af9" containerID="6d92edb80f91b96f979da0d7c4420e4c8fd0a06254620436dfc997ac265bf0b8" exitCode=0 Mar 10 15:13:46 crc kubenswrapper[4795]: I0310 15:13:46.279189 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkhv" 
event={"ID":"2290378c-c594-45e6-9436-0bff75e32af9","Type":"ContainerDied","Data":"6d92edb80f91b96f979da0d7c4420e4c8fd0a06254620436dfc997ac265bf0b8"} Mar 10 15:13:46 crc kubenswrapper[4795]: I0310 15:13:46.279243 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkhv" event={"ID":"2290378c-c594-45e6-9436-0bff75e32af9","Type":"ContainerStarted","Data":"1348df1ec2a17bfa16fc877e30be86850b4aaa79972d238751156609eb2e9474"} Mar 10 15:13:46 crc kubenswrapper[4795]: I0310 15:13:46.283550 4795 generic.go:334] "Generic (PLEG): container finished" podID="1c92a54b-5890-4e03-8c5a-c02f308fa42c" containerID="11f4fdd9813296c3205f88203467fbdecc5f703152809e6a62f66f7fc59b37ea" exitCode=0 Mar 10 15:13:46 crc kubenswrapper[4795]: I0310 15:13:46.283694 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zdl2" event={"ID":"1c92a54b-5890-4e03-8c5a-c02f308fa42c","Type":"ContainerDied","Data":"11f4fdd9813296c3205f88203467fbdecc5f703152809e6a62f66f7fc59b37ea"} Mar 10 15:13:46 crc kubenswrapper[4795]: I0310 15:13:46.283749 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zdl2" event={"ID":"1c92a54b-5890-4e03-8c5a-c02f308fa42c","Type":"ContainerStarted","Data":"3bf64768e39a8149c4c3d8cd5425e5f4e5ebf4c1ec5d5938bde9b1ecd5868c08"} Mar 10 15:13:46 crc kubenswrapper[4795]: I0310 15:13:46.288204 4795 generic.go:334] "Generic (PLEG): container finished" podID="fb933310-8cd4-41f5-8a2e-2956f51956e1" containerID="7b7a04114c003ddc4c1243c8aa4ea6e8715142a90f2c29c7bfa415b45abd6868" exitCode=0 Mar 10 15:13:46 crc kubenswrapper[4795]: I0310 15:13:46.288239 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj98w" event={"ID":"fb933310-8cd4-41f5-8a2e-2956f51956e1","Type":"ContainerDied","Data":"7b7a04114c003ddc4c1243c8aa4ea6e8715142a90f2c29c7bfa415b45abd6868"} Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 
15:13:47.222696 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vn6b2"] Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.223906 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vn6b2" Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.226442 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.239516 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vn6b2"] Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.306419 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pj98w" event={"ID":"fb933310-8cd4-41f5-8a2e-2956f51956e1","Type":"ContainerStarted","Data":"a35060212dc19541e516541b8160af25766a44b38f8ee0de3584f26bc169aef8"} Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.308763 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkhv" event={"ID":"2290378c-c594-45e6-9436-0bff75e32af9","Type":"ContainerStarted","Data":"22c3d507f58221a6615ffe15b488197fe105b9138b0247000bf6f2b08dac6993"} Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.309727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dade6b65-b7ed-45c3-bf75-28e8d62d94c6-catalog-content\") pod \"redhat-operators-vn6b2\" (UID: \"dade6b65-b7ed-45c3-bf75-28e8d62d94c6\") " pod="openshift-marketplace/redhat-operators-vn6b2" Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.309828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dade6b65-b7ed-45c3-bf75-28e8d62d94c6-utilities\") pod 
\"redhat-operators-vn6b2\" (UID: \"dade6b65-b7ed-45c3-bf75-28e8d62d94c6\") " pod="openshift-marketplace/redhat-operators-vn6b2" Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.309876 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb2gp\" (UniqueName: \"kubernetes.io/projected/dade6b65-b7ed-45c3-bf75-28e8d62d94c6-kube-api-access-cb2gp\") pod \"redhat-operators-vn6b2\" (UID: \"dade6b65-b7ed-45c3-bf75-28e8d62d94c6\") " pod="openshift-marketplace/redhat-operators-vn6b2" Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.323959 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pj98w" podStartSLOduration=2.8902557399999997 podStartE2EDuration="5.323931706s" podCreationTimestamp="2026-03-10 15:13:42 +0000 UTC" firstStartedPulling="2026-03-10 15:13:44.261298743 +0000 UTC m=+457.427039671" lastFinishedPulling="2026-03-10 15:13:46.694974709 +0000 UTC m=+459.860715637" observedRunningTime="2026-03-10 15:13:47.322301619 +0000 UTC m=+460.488042527" watchObservedRunningTime="2026-03-10 15:13:47.323931706 +0000 UTC m=+460.489672604" Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.410975 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dade6b65-b7ed-45c3-bf75-28e8d62d94c6-utilities\") pod \"redhat-operators-vn6b2\" (UID: \"dade6b65-b7ed-45c3-bf75-28e8d62d94c6\") " pod="openshift-marketplace/redhat-operators-vn6b2" Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.411423 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dade6b65-b7ed-45c3-bf75-28e8d62d94c6-utilities\") pod \"redhat-operators-vn6b2\" (UID: \"dade6b65-b7ed-45c3-bf75-28e8d62d94c6\") " pod="openshift-marketplace/redhat-operators-vn6b2" Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.411560 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb2gp\" (UniqueName: \"kubernetes.io/projected/dade6b65-b7ed-45c3-bf75-28e8d62d94c6-kube-api-access-cb2gp\") pod \"redhat-operators-vn6b2\" (UID: \"dade6b65-b7ed-45c3-bf75-28e8d62d94c6\") " pod="openshift-marketplace/redhat-operators-vn6b2"
Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.411825 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dade6b65-b7ed-45c3-bf75-28e8d62d94c6-catalog-content\") pod \"redhat-operators-vn6b2\" (UID: \"dade6b65-b7ed-45c3-bf75-28e8d62d94c6\") " pod="openshift-marketplace/redhat-operators-vn6b2"
Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.412141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dade6b65-b7ed-45c3-bf75-28e8d62d94c6-catalog-content\") pod \"redhat-operators-vn6b2\" (UID: \"dade6b65-b7ed-45c3-bf75-28e8d62d94c6\") " pod="openshift-marketplace/redhat-operators-vn6b2"
Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.457005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb2gp\" (UniqueName: \"kubernetes.io/projected/dade6b65-b7ed-45c3-bf75-28e8d62d94c6-kube-api-access-cb2gp\") pod \"redhat-operators-vn6b2\" (UID: \"dade6b65-b7ed-45c3-bf75-28e8d62d94c6\") " pod="openshift-marketplace/redhat-operators-vn6b2"
Mar 10 15:13:47 crc kubenswrapper[4795]: I0310 15:13:47.541650 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vn6b2"
Mar 10 15:13:48 crc kubenswrapper[4795]: I0310 15:13:47.993698 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vn6b2"]
Mar 10 15:13:48 crc kubenswrapper[4795]: I0310 15:13:48.324908 4795 generic.go:334] "Generic (PLEG): container finished" podID="2290378c-c594-45e6-9436-0bff75e32af9" containerID="22c3d507f58221a6615ffe15b488197fe105b9138b0247000bf6f2b08dac6993" exitCode=0
Mar 10 15:13:48 crc kubenswrapper[4795]: I0310 15:13:48.324968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkhv" event={"ID":"2290378c-c594-45e6-9436-0bff75e32af9","Type":"ContainerDied","Data":"22c3d507f58221a6615ffe15b488197fe105b9138b0247000bf6f2b08dac6993"}
Mar 10 15:13:48 crc kubenswrapper[4795]: I0310 15:13:48.328534 4795 generic.go:334] "Generic (PLEG): container finished" podID="1c92a54b-5890-4e03-8c5a-c02f308fa42c" containerID="f5326f02d39852cfb13793898dd5252bc7d6c45cd4780c8d00c18256591919ce" exitCode=0
Mar 10 15:13:48 crc kubenswrapper[4795]: I0310 15:13:48.329047 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zdl2" event={"ID":"1c92a54b-5890-4e03-8c5a-c02f308fa42c","Type":"ContainerDied","Data":"f5326f02d39852cfb13793898dd5252bc7d6c45cd4780c8d00c18256591919ce"}
Mar 10 15:13:48 crc kubenswrapper[4795]: I0310 15:13:48.334099 4795 generic.go:334] "Generic (PLEG): container finished" podID="dade6b65-b7ed-45c3-bf75-28e8d62d94c6" containerID="85817207389b081aed66d64250a258caa818d4e1186e54b0e487fea35e8532f6" exitCode=0
Mar 10 15:13:48 crc kubenswrapper[4795]: I0310 15:13:48.335209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vn6b2" event={"ID":"dade6b65-b7ed-45c3-bf75-28e8d62d94c6","Type":"ContainerDied","Data":"85817207389b081aed66d64250a258caa818d4e1186e54b0e487fea35e8532f6"}
Mar 10 15:13:48 crc kubenswrapper[4795]: I0310 15:13:48.335238 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vn6b2" event={"ID":"dade6b65-b7ed-45c3-bf75-28e8d62d94c6","Type":"ContainerStarted","Data":"2a47e822f1fa90c092114cd056886cc1f9efa997a18948875a0bf4de02e4f361"}
Mar 10 15:13:48 crc kubenswrapper[4795]: I0310 15:13:48.539394 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:13:48 crc kubenswrapper[4795]: I0310 15:13:48.539530 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:13:49 crc kubenswrapper[4795]: I0310 15:13:49.343637 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9zdl2" event={"ID":"1c92a54b-5890-4e03-8c5a-c02f308fa42c","Type":"ContainerStarted","Data":"b11bc86d60f4f85b75ef8e90e3672afd6f496460e923c394acc7336bd8637402"}
Mar 10 15:13:50 crc kubenswrapper[4795]: I0310 15:13:50.350693 4795 generic.go:334] "Generic (PLEG): container finished" podID="dade6b65-b7ed-45c3-bf75-28e8d62d94c6" containerID="8bcd6d33e185e6805bc0ba732f0eb76ef3f17ad9375e289f84336afa6a0026da" exitCode=0
Mar 10 15:13:50 crc kubenswrapper[4795]: I0310 15:13:50.350756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vn6b2" event={"ID":"dade6b65-b7ed-45c3-bf75-28e8d62d94c6","Type":"ContainerDied","Data":"8bcd6d33e185e6805bc0ba732f0eb76ef3f17ad9375e289f84336afa6a0026da"}
Mar 10 15:13:50 crc kubenswrapper[4795]: I0310 15:13:50.357176 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qxkhv" event={"ID":"2290378c-c594-45e6-9436-0bff75e32af9","Type":"ContainerStarted","Data":"aa2fd0666ec5d04b0150734f5f18e85e20541da8dac938c79473298bff8fdc06"}
Mar 10 15:13:50 crc kubenswrapper[4795]: I0310 15:13:50.371547 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9zdl2" podStartSLOduration=3.868256442 podStartE2EDuration="6.371528127s" podCreationTimestamp="2026-03-10 15:13:44 +0000 UTC" firstStartedPulling="2026-03-10 15:13:46.292394836 +0000 UTC m=+459.458135774" lastFinishedPulling="2026-03-10 15:13:48.795666521 +0000 UTC m=+461.961407459" observedRunningTime="2026-03-10 15:13:49.372759448 +0000 UTC m=+462.538500346" watchObservedRunningTime="2026-03-10 15:13:50.371528127 +0000 UTC m=+463.537269035"
Mar 10 15:13:50 crc kubenswrapper[4795]: I0310 15:13:50.389475 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qxkhv" podStartSLOduration=2.488510253 podStartE2EDuration="5.389457062s" podCreationTimestamp="2026-03-10 15:13:45 +0000 UTC" firstStartedPulling="2026-03-10 15:13:46.282527562 +0000 UTC m=+459.448268500" lastFinishedPulling="2026-03-10 15:13:49.183474371 +0000 UTC m=+462.349215309" observedRunningTime="2026-03-10 15:13:50.386452125 +0000 UTC m=+463.552193023" watchObservedRunningTime="2026-03-10 15:13:50.389457062 +0000 UTC m=+463.555197960"
Mar 10 15:13:51 crc kubenswrapper[4795]: I0310 15:13:51.365650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vn6b2" event={"ID":"dade6b65-b7ed-45c3-bf75-28e8d62d94c6","Type":"ContainerStarted","Data":"0f1a4c63635007debdffd751aee2d7842579fd14a61d7cf1d6f5badd1ba1c347"}
Mar 10 15:13:51 crc kubenswrapper[4795]: I0310 15:13:51.395574 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vn6b2" podStartSLOduration=1.8662245290000001 podStartE2EDuration="4.395550092s" podCreationTimestamp="2026-03-10 15:13:47 +0000 UTC" firstStartedPulling="2026-03-10 15:13:48.335593206 +0000 UTC m=+461.501334114" lastFinishedPulling="2026-03-10 15:13:50.864918769 +0000 UTC m=+464.030659677" observedRunningTime="2026-03-10 15:13:51.394637976 +0000 UTC m=+464.560378874" watchObservedRunningTime="2026-03-10 15:13:51.395550092 +0000 UTC m=+464.561290990"
Mar 10 15:13:52 crc kubenswrapper[4795]: I0310 15:13:52.941632 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pj98w"
Mar 10 15:13:52 crc kubenswrapper[4795]: I0310 15:13:52.941718 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pj98w"
Mar 10 15:13:53 crc kubenswrapper[4795]: I0310 15:13:53.012656 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pj98w"
Mar 10 15:13:53 crc kubenswrapper[4795]: I0310 15:13:53.421115 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pj98w"
Mar 10 15:13:55 crc kubenswrapper[4795]: I0310 15:13:55.177601 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9zdl2"
Mar 10 15:13:55 crc kubenswrapper[4795]: I0310 15:13:55.177932 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9zdl2"
Mar 10 15:13:55 crc kubenswrapper[4795]: I0310 15:13:55.224668 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9zdl2"
Mar 10 15:13:55 crc kubenswrapper[4795]: I0310 15:13:55.381327 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qxkhv"
Mar 10 15:13:55 crc kubenswrapper[4795]: I0310 15:13:55.381369 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qxkhv"
Mar 10 15:13:55 crc kubenswrapper[4795]: I0310 15:13:55.433791 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qxkhv"
Mar 10 15:13:55 crc kubenswrapper[4795]: I0310 15:13:55.452115 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9zdl2"
Mar 10 15:13:55 crc kubenswrapper[4795]: I0310 15:13:55.475321 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qxkhv"
Mar 10 15:13:57 crc kubenswrapper[4795]: I0310 15:13:57.542397 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vn6b2"
Mar 10 15:13:57 crc kubenswrapper[4795]: I0310 15:13:57.542445 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vn6b2"
Mar 10 15:13:57 crc kubenswrapper[4795]: I0310 15:13:57.580638 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vn6b2"
Mar 10 15:13:58 crc kubenswrapper[4795]: I0310 15:13:58.486135 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vn6b2"
Mar 10 15:13:58 crc kubenswrapper[4795]: I0310 15:13:58.891591 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-9h4n2"
Mar 10 15:13:58 crc kubenswrapper[4795]: I0310 15:13:58.957857 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vn4gm"]
Mar 10 15:14:00 crc kubenswrapper[4795]: I0310 15:14:00.176642 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552594-c8s55"]
Mar 10 15:14:00 crc kubenswrapper[4795]: I0310 15:14:00.177461 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-c8s55"
Mar 10 15:14:00 crc kubenswrapper[4795]: I0310 15:14:00.179432 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5"
Mar 10 15:14:00 crc kubenswrapper[4795]: I0310 15:14:00.180598 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 15:14:00 crc kubenswrapper[4795]: I0310 15:14:00.181750 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 15:14:00 crc kubenswrapper[4795]: I0310 15:14:00.190566 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-c8s55"]
Mar 10 15:14:00 crc kubenswrapper[4795]: I0310 15:14:00.210092 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6twm\" (UniqueName: \"kubernetes.io/projected/01f61473-d3ef-4f22-8f44-abb0b66b8a77-kube-api-access-f6twm\") pod \"auto-csr-approver-29552594-c8s55\" (UID: \"01f61473-d3ef-4f22-8f44-abb0b66b8a77\") " pod="openshift-infra/auto-csr-approver-29552594-c8s55"
Mar 10 15:14:00 crc kubenswrapper[4795]: I0310 15:14:00.311429 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6twm\" (UniqueName: \"kubernetes.io/projected/01f61473-d3ef-4f22-8f44-abb0b66b8a77-kube-api-access-f6twm\") pod \"auto-csr-approver-29552594-c8s55\" (UID: \"01f61473-d3ef-4f22-8f44-abb0b66b8a77\") " pod="openshift-infra/auto-csr-approver-29552594-c8s55"
Mar 10 15:14:00 crc kubenswrapper[4795]: I0310 15:14:00.338047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6twm\" (UniqueName: \"kubernetes.io/projected/01f61473-d3ef-4f22-8f44-abb0b66b8a77-kube-api-access-f6twm\") pod \"auto-csr-approver-29552594-c8s55\" (UID: \"01f61473-d3ef-4f22-8f44-abb0b66b8a77\") " pod="openshift-infra/auto-csr-approver-29552594-c8s55"
Mar 10 15:14:00 crc kubenswrapper[4795]: I0310 15:14:00.500652 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-c8s55"
Mar 10 15:14:00 crc kubenswrapper[4795]: I0310 15:14:00.957467 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-c8s55"]
Mar 10 15:14:00 crc kubenswrapper[4795]: W0310 15:14:00.963373 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01f61473_d3ef_4f22_8f44_abb0b66b8a77.slice/crio-ca6ec4712ae80efbb722d8bcccdf856cac307c795af3f7a6cd650603959837cd WatchSource:0}: Error finding container ca6ec4712ae80efbb722d8bcccdf856cac307c795af3f7a6cd650603959837cd: Status 404 returned error can't find the container with id ca6ec4712ae80efbb722d8bcccdf856cac307c795af3f7a6cd650603959837cd
Mar 10 15:14:01 crc kubenswrapper[4795]: I0310 15:14:01.434804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-c8s55" event={"ID":"01f61473-d3ef-4f22-8f44-abb0b66b8a77","Type":"ContainerStarted","Data":"ca6ec4712ae80efbb722d8bcccdf856cac307c795af3f7a6cd650603959837cd"}
Mar 10 15:14:02 crc kubenswrapper[4795]: I0310 15:14:02.441702 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-c8s55" event={"ID":"01f61473-d3ef-4f22-8f44-abb0b66b8a77","Type":"ContainerStarted","Data":"33ebbf35641f5d193e21a31217a65a594ef5a2f13ee57f6b29005868e3d0047e"}
Mar 10 15:14:02 crc kubenswrapper[4795]: I0310 15:14:02.456500 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552594-c8s55" podStartSLOduration=1.471794748 podStartE2EDuration="2.456486374s" podCreationTimestamp="2026-03-10 15:14:00 +0000 UTC" firstStartedPulling="2026-03-10 15:14:00.966223626 +0000 UTC m=+474.131964544" lastFinishedPulling="2026-03-10 15:14:01.950915242 +0000 UTC m=+475.116656170" observedRunningTime="2026-03-10 15:14:02.45459745 +0000 UTC m=+475.620338358" watchObservedRunningTime="2026-03-10 15:14:02.456486374 +0000 UTC m=+475.622227262"
Mar 10 15:14:03 crc kubenswrapper[4795]: I0310 15:14:03.452172 4795 generic.go:334] "Generic (PLEG): container finished" podID="01f61473-d3ef-4f22-8f44-abb0b66b8a77" containerID="33ebbf35641f5d193e21a31217a65a594ef5a2f13ee57f6b29005868e3d0047e" exitCode=0
Mar 10 15:14:03 crc kubenswrapper[4795]: I0310 15:14:03.452268 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-c8s55" event={"ID":"01f61473-d3ef-4f22-8f44-abb0b66b8a77","Type":"ContainerDied","Data":"33ebbf35641f5d193e21a31217a65a594ef5a2f13ee57f6b29005868e3d0047e"}
Mar 10 15:14:04 crc kubenswrapper[4795]: I0310 15:14:04.728239 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-c8s55"
Mar 10 15:14:04 crc kubenswrapper[4795]: I0310 15:14:04.779855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6twm\" (UniqueName: \"kubernetes.io/projected/01f61473-d3ef-4f22-8f44-abb0b66b8a77-kube-api-access-f6twm\") pod \"01f61473-d3ef-4f22-8f44-abb0b66b8a77\" (UID: \"01f61473-d3ef-4f22-8f44-abb0b66b8a77\") "
Mar 10 15:14:04 crc kubenswrapper[4795]: I0310 15:14:04.788645 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01f61473-d3ef-4f22-8f44-abb0b66b8a77-kube-api-access-f6twm" (OuterVolumeSpecName: "kube-api-access-f6twm") pod "01f61473-d3ef-4f22-8f44-abb0b66b8a77" (UID: "01f61473-d3ef-4f22-8f44-abb0b66b8a77"). InnerVolumeSpecName "kube-api-access-f6twm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:14:04 crc kubenswrapper[4795]: I0310 15:14:04.881281 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6twm\" (UniqueName: \"kubernetes.io/projected/01f61473-d3ef-4f22-8f44-abb0b66b8a77-kube-api-access-f6twm\") on node \"crc\" DevicePath \"\""
Mar 10 15:14:05 crc kubenswrapper[4795]: I0310 15:14:05.468626 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552594-c8s55" event={"ID":"01f61473-d3ef-4f22-8f44-abb0b66b8a77","Type":"ContainerDied","Data":"ca6ec4712ae80efbb722d8bcccdf856cac307c795af3f7a6cd650603959837cd"}
Mar 10 15:14:05 crc kubenswrapper[4795]: I0310 15:14:05.468676 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca6ec4712ae80efbb722d8bcccdf856cac307c795af3f7a6cd650603959837cd"
Mar 10 15:14:05 crc kubenswrapper[4795]: I0310 15:14:05.468703 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552594-c8s55"
Mar 10 15:14:05 crc kubenswrapper[4795]: I0310 15:14:05.523166 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-zdmvt"]
Mar 10 15:14:05 crc kubenswrapper[4795]: I0310 15:14:05.526505 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552588-zdmvt"]
Mar 10 15:14:07 crc kubenswrapper[4795]: I0310 15:14:07.486545 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c868aa84-d232-4d80-bff3-d9e0aa659769" path="/var/lib/kubelet/pods/c868aa84-d232-4d80-bff3-d9e0aa659769/volumes"
Mar 10 15:14:18 crc kubenswrapper[4795]: I0310 15:14:18.539126 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:14:18 crc kubenswrapper[4795]: I0310 15:14:18.539815 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:14:18 crc kubenswrapper[4795]: I0310 15:14:18.539874 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh"
Mar 10 15:14:18 crc kubenswrapper[4795]: I0310 15:14:18.540559 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5ce3669fd9d20dd9ef4d0ed6efd59f873656df37d63a73174e09871244bd7b97"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 15:14:18 crc kubenswrapper[4795]: I0310 15:14:18.540645 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://5ce3669fd9d20dd9ef4d0ed6efd59f873656df37d63a73174e09871244bd7b97" gracePeriod=600
Mar 10 15:14:19 crc kubenswrapper[4795]: I0310 15:14:19.545612 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="5ce3669fd9d20dd9ef4d0ed6efd59f873656df37d63a73174e09871244bd7b97" exitCode=0
Mar 10 15:14:19 crc kubenswrapper[4795]: I0310 15:14:19.545686 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"5ce3669fd9d20dd9ef4d0ed6efd59f873656df37d63a73174e09871244bd7b97"}
Mar 10 15:14:19 crc kubenswrapper[4795]: I0310 15:14:19.546260 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"8436cacf1b5dbb1ecf156aa5041298791f50d7318b7f087048ec77945fbd7697"}
Mar 10 15:14:19 crc kubenswrapper[4795]: I0310 15:14:19.546278 4795 scope.go:117] "RemoveContainer" containerID="adb3ecafbb19be467cfc11fdec1d33c5762d7422ebfbc9dd0893d8375f2e6a51"
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.002410 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" podUID="2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" containerName="registry" containerID="cri-o://c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e" gracePeriod=30
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.375805 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.429422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-bound-sa-token\") pod \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") "
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.429498 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-trusted-ca\") pod \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") "
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.429527 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-tls\") pod \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") "
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.429555 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw8mb\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-kube-api-access-rw8mb\") pod \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") "
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.429598 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-ca-trust-extracted\") pod \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") "
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.429766 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") "
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.429836 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-installation-pull-secrets\") pod \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") "
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.429868 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-certificates\") pod \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\" (UID: \"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16\") "
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.431063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.431125 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.436311 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.436903 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-kube-api-access-rw8mb" (OuterVolumeSpecName: "kube-api-access-rw8mb") pod "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16"). InnerVolumeSpecName "kube-api-access-rw8mb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.437141 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.437369 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.441574 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.469605 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" (UID: "2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.530956 4795 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.530993 4795 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.531006 4795 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.531018 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.531030 4795 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.531042 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw8mb\" (UniqueName: \"kubernetes.io/projected/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-kube-api-access-rw8mb\") on node \"crc\" DevicePath \"\""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.531053 4795 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.578706 4795 generic.go:334] "Generic (PLEG): container finished" podID="2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" containerID="c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e" exitCode=0
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.578748 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" event={"ID":"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16","Type":"ContainerDied","Data":"c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e"}
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.578774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm" event={"ID":"2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16","Type":"ContainerDied","Data":"2563dbac85073daa3fec6c680cc18882fd6fd7f4d86394af5b2ddd43e5ed721c"}
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.578789 4795 scope.go:117] "RemoveContainer" containerID="c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e"
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.578873 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vn4gm"
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.608097 4795 scope.go:117] "RemoveContainer" containerID="c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e"
Mar 10 15:14:24 crc kubenswrapper[4795]: E0310 15:14:24.608439 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e\": container with ID starting with c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e not found: ID does not exist" containerID="c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e"
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.608466 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e"} err="failed to get container status \"c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e\": rpc error: code = NotFound desc = could not find container \"c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e\": container with ID starting with c4191eac83b288c0d87708361e9ee699bac2d1b6a7a9af404954ea4fc1947e5e not found: ID does not exist"
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.609614 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vn4gm"]
Mar 10 15:14:24 crc kubenswrapper[4795]: I0310 15:14:24.611622 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vn4gm"]
Mar 10 15:14:25 crc kubenswrapper[4795]: I0310 15:14:25.483416 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" path="/var/lib/kubelet/pods/2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16/volumes"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.155328 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"]
Mar 10 15:15:00 crc kubenswrapper[4795]: E0310 15:15:00.156286 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01f61473-d3ef-4f22-8f44-abb0b66b8a77" containerName="oc"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.156312 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01f61473-d3ef-4f22-8f44-abb0b66b8a77" containerName="oc"
Mar 10 15:15:00 crc kubenswrapper[4795]: E0310 15:15:00.156342 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" containerName="registry"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.156355 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" containerName="registry"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.156512 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01f61473-d3ef-4f22-8f44-abb0b66b8a77" containerName="oc"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.156531 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2243fa0e-0a9a-4b9e-b0af-79f1e86a5b16" containerName="registry"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.157133 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.163380 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.163817 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.169710 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"]
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.177152 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34cd1574-f40a-4b09-b79a-8bd20a4d9698-secret-volume\") pod \"collect-profiles-29552595-lbf4t\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.177235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34cd1574-f40a-4b09-b79a-8bd20a4d9698-config-volume\") pod \"collect-profiles-29552595-lbf4t\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.177298 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp4m2\" (UniqueName: \"kubernetes.io/projected/34cd1574-f40a-4b09-b79a-8bd20a4d9698-kube-api-access-tp4m2\") pod \"collect-profiles-29552595-lbf4t\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.278598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34cd1574-f40a-4b09-b79a-8bd20a4d9698-secret-volume\") pod \"collect-profiles-29552595-lbf4t\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.278652 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34cd1574-f40a-4b09-b79a-8bd20a4d9698-config-volume\") pod \"collect-profiles-29552595-lbf4t\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.278689 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp4m2\" (UniqueName: \"kubernetes.io/projected/34cd1574-f40a-4b09-b79a-8bd20a4d9698-kube-api-access-tp4m2\") pod \"collect-profiles-29552595-lbf4t\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.280566 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34cd1574-f40a-4b09-b79a-8bd20a4d9698-config-volume\") pod \"collect-profiles-29552595-lbf4t\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"
Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.287524 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName:
\"kubernetes.io/secret/34cd1574-f40a-4b09-b79a-8bd20a4d9698-secret-volume\") pod \"collect-profiles-29552595-lbf4t\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t" Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.300047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp4m2\" (UniqueName: \"kubernetes.io/projected/34cd1574-f40a-4b09-b79a-8bd20a4d9698-kube-api-access-tp4m2\") pod \"collect-profiles-29552595-lbf4t\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t" Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.519402 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t" Mar 10 15:15:00 crc kubenswrapper[4795]: I0310 15:15:00.746445 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"] Mar 10 15:15:01 crc kubenswrapper[4795]: I0310 15:15:01.116206 4795 generic.go:334] "Generic (PLEG): container finished" podID="34cd1574-f40a-4b09-b79a-8bd20a4d9698" containerID="50a6ad604734c081e6c96830706a5a9f07eb3cb3311c9e768206e61cf4876ea3" exitCode=0 Mar 10 15:15:01 crc kubenswrapper[4795]: I0310 15:15:01.116324 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t" event={"ID":"34cd1574-f40a-4b09-b79a-8bd20a4d9698","Type":"ContainerDied","Data":"50a6ad604734c081e6c96830706a5a9f07eb3cb3311c9e768206e61cf4876ea3"} Mar 10 15:15:01 crc kubenswrapper[4795]: I0310 15:15:01.116635 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t" 
event={"ID":"34cd1574-f40a-4b09-b79a-8bd20a4d9698","Type":"ContainerStarted","Data":"e4ef2983519343a9b2c877ade9565786af063cef887fac8b2f0e25cd8b7ad27b"} Mar 10 15:15:02 crc kubenswrapper[4795]: I0310 15:15:02.410087 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t" Mar 10 15:15:02 crc kubenswrapper[4795]: I0310 15:15:02.509438 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34cd1574-f40a-4b09-b79a-8bd20a4d9698-secret-volume\") pod \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " Mar 10 15:15:02 crc kubenswrapper[4795]: I0310 15:15:02.509513 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp4m2\" (UniqueName: \"kubernetes.io/projected/34cd1574-f40a-4b09-b79a-8bd20a4d9698-kube-api-access-tp4m2\") pod \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " Mar 10 15:15:02 crc kubenswrapper[4795]: I0310 15:15:02.509585 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34cd1574-f40a-4b09-b79a-8bd20a4d9698-config-volume\") pod \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\" (UID: \"34cd1574-f40a-4b09-b79a-8bd20a4d9698\") " Mar 10 15:15:02 crc kubenswrapper[4795]: I0310 15:15:02.510721 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34cd1574-f40a-4b09-b79a-8bd20a4d9698-config-volume" (OuterVolumeSpecName: "config-volume") pod "34cd1574-f40a-4b09-b79a-8bd20a4d9698" (UID: "34cd1574-f40a-4b09-b79a-8bd20a4d9698"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:15:02 crc kubenswrapper[4795]: I0310 15:15:02.517871 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34cd1574-f40a-4b09-b79a-8bd20a4d9698-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "34cd1574-f40a-4b09-b79a-8bd20a4d9698" (UID: "34cd1574-f40a-4b09-b79a-8bd20a4d9698"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:15:02 crc kubenswrapper[4795]: I0310 15:15:02.518275 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34cd1574-f40a-4b09-b79a-8bd20a4d9698-kube-api-access-tp4m2" (OuterVolumeSpecName: "kube-api-access-tp4m2") pod "34cd1574-f40a-4b09-b79a-8bd20a4d9698" (UID: "34cd1574-f40a-4b09-b79a-8bd20a4d9698"). InnerVolumeSpecName "kube-api-access-tp4m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:15:02 crc kubenswrapper[4795]: I0310 15:15:02.610990 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/34cd1574-f40a-4b09-b79a-8bd20a4d9698-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:15:02 crc kubenswrapper[4795]: I0310 15:15:02.611020 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp4m2\" (UniqueName: \"kubernetes.io/projected/34cd1574-f40a-4b09-b79a-8bd20a4d9698-kube-api-access-tp4m2\") on node \"crc\" DevicePath \"\"" Mar 10 15:15:02 crc kubenswrapper[4795]: I0310 15:15:02.611029 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/34cd1574-f40a-4b09-b79a-8bd20a4d9698-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:15:03 crc kubenswrapper[4795]: I0310 15:15:03.133656 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t" 
event={"ID":"34cd1574-f40a-4b09-b79a-8bd20a4d9698","Type":"ContainerDied","Data":"e4ef2983519343a9b2c877ade9565786af063cef887fac8b2f0e25cd8b7ad27b"} Mar 10 15:15:03 crc kubenswrapper[4795]: I0310 15:15:03.134111 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4ef2983519343a9b2c877ade9565786af063cef887fac8b2f0e25cd8b7ad27b" Mar 10 15:15:03 crc kubenswrapper[4795]: I0310 15:15:03.133818 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t" Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.143257 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552596-vvqwp"] Mar 10 15:16:00 crc kubenswrapper[4795]: E0310 15:16:00.143861 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34cd1574-f40a-4b09-b79a-8bd20a4d9698" containerName="collect-profiles" Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.143874 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cd1574-f40a-4b09-b79a-8bd20a4d9698" containerName="collect-profiles" Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.143965 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="34cd1574-f40a-4b09-b79a-8bd20a4d9698" containerName="collect-profiles" Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.144303 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-vvqwp" Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.146822 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.147124 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.147128 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.160961 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552596-vvqwp"] Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.219916 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2q96\" (UniqueName: \"kubernetes.io/projected/eb92169a-5109-4bf5-85e4-313837d438d4-kube-api-access-d2q96\") pod \"auto-csr-approver-29552596-vvqwp\" (UID: \"eb92169a-5109-4bf5-85e4-313837d438d4\") " pod="openshift-infra/auto-csr-approver-29552596-vvqwp" Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.321252 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2q96\" (UniqueName: \"kubernetes.io/projected/eb92169a-5109-4bf5-85e4-313837d438d4-kube-api-access-d2q96\") pod \"auto-csr-approver-29552596-vvqwp\" (UID: \"eb92169a-5109-4bf5-85e4-313837d438d4\") " pod="openshift-infra/auto-csr-approver-29552596-vvqwp" Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.349520 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2q96\" (UniqueName: \"kubernetes.io/projected/eb92169a-5109-4bf5-85e4-313837d438d4-kube-api-access-d2q96\") pod \"auto-csr-approver-29552596-vvqwp\" (UID: \"eb92169a-5109-4bf5-85e4-313837d438d4\") " 
pod="openshift-infra/auto-csr-approver-29552596-vvqwp" Mar 10 15:16:00 crc kubenswrapper[4795]: I0310 15:16:00.475502 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-vvqwp" Mar 10 15:16:01 crc kubenswrapper[4795]: I0310 15:16:01.001904 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552596-vvqwp"] Mar 10 15:16:01 crc kubenswrapper[4795]: I0310 15:16:01.022209 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:16:01 crc kubenswrapper[4795]: I0310 15:16:01.552136 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552596-vvqwp" event={"ID":"eb92169a-5109-4bf5-85e4-313837d438d4","Type":"ContainerStarted","Data":"17cedfa90962c6f2e60fd99548ad5d18516073882a08af6e872ca9ce454ea67c"} Mar 10 15:16:02 crc kubenswrapper[4795]: I0310 15:16:02.564754 4795 generic.go:334] "Generic (PLEG): container finished" podID="eb92169a-5109-4bf5-85e4-313837d438d4" containerID="26b6215e3d8f46effa1b154c6dc341ea80b349d32bb0d7b5ca65a972d6372eb5" exitCode=0 Mar 10 15:16:02 crc kubenswrapper[4795]: I0310 15:16:02.564840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552596-vvqwp" event={"ID":"eb92169a-5109-4bf5-85e4-313837d438d4","Type":"ContainerDied","Data":"26b6215e3d8f46effa1b154c6dc341ea80b349d32bb0d7b5ca65a972d6372eb5"} Mar 10 15:16:03 crc kubenswrapper[4795]: I0310 15:16:03.879328 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-vvqwp" Mar 10 15:16:03 crc kubenswrapper[4795]: I0310 15:16:03.984125 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2q96\" (UniqueName: \"kubernetes.io/projected/eb92169a-5109-4bf5-85e4-313837d438d4-kube-api-access-d2q96\") pod \"eb92169a-5109-4bf5-85e4-313837d438d4\" (UID: \"eb92169a-5109-4bf5-85e4-313837d438d4\") " Mar 10 15:16:03 crc kubenswrapper[4795]: I0310 15:16:03.993612 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb92169a-5109-4bf5-85e4-313837d438d4-kube-api-access-d2q96" (OuterVolumeSpecName: "kube-api-access-d2q96") pod "eb92169a-5109-4bf5-85e4-313837d438d4" (UID: "eb92169a-5109-4bf5-85e4-313837d438d4"). InnerVolumeSpecName "kube-api-access-d2q96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:16:04 crc kubenswrapper[4795]: I0310 15:16:04.085797 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2q96\" (UniqueName: \"kubernetes.io/projected/eb92169a-5109-4bf5-85e4-313837d438d4-kube-api-access-d2q96\") on node \"crc\" DevicePath \"\"" Mar 10 15:16:04 crc kubenswrapper[4795]: I0310 15:16:04.590565 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552596-vvqwp" event={"ID":"eb92169a-5109-4bf5-85e4-313837d438d4","Type":"ContainerDied","Data":"17cedfa90962c6f2e60fd99548ad5d18516073882a08af6e872ca9ce454ea67c"} Mar 10 15:16:04 crc kubenswrapper[4795]: I0310 15:16:04.590635 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17cedfa90962c6f2e60fd99548ad5d18516073882a08af6e872ca9ce454ea67c" Mar 10 15:16:04 crc kubenswrapper[4795]: I0310 15:16:04.590734 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552596-vvqwp" Mar 10 15:16:04 crc kubenswrapper[4795]: I0310 15:16:04.950767 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-2x5qv"] Mar 10 15:16:04 crc kubenswrapper[4795]: I0310 15:16:04.956786 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552590-2x5qv"] Mar 10 15:16:05 crc kubenswrapper[4795]: I0310 15:16:05.490317 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d15efc4-4cee-459e-b10d-e0452d172fc7" path="/var/lib/kubelet/pods/4d15efc4-4cee-459e-b10d-e0452d172fc7/volumes" Mar 10 15:16:18 crc kubenswrapper[4795]: I0310 15:16:18.539419 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:16:18 crc kubenswrapper[4795]: I0310 15:16:18.540231 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:16:48 crc kubenswrapper[4795]: I0310 15:16:48.539928 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:16:48 crc kubenswrapper[4795]: I0310 15:16:48.540484 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" 
podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:17:18 crc kubenswrapper[4795]: I0310 15:17:18.538980 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:17:18 crc kubenswrapper[4795]: I0310 15:17:18.539863 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:17:18 crc kubenswrapper[4795]: I0310 15:17:18.539941 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:17:18 crc kubenswrapper[4795]: I0310 15:17:18.540895 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8436cacf1b5dbb1ecf156aa5041298791f50d7318b7f087048ec77945fbd7697"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:17:18 crc kubenswrapper[4795]: I0310 15:17:18.541044 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://8436cacf1b5dbb1ecf156aa5041298791f50d7318b7f087048ec77945fbd7697" gracePeriod=600 Mar 10 
15:17:19 crc kubenswrapper[4795]: I0310 15:17:19.185818 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="8436cacf1b5dbb1ecf156aa5041298791f50d7318b7f087048ec77945fbd7697" exitCode=0 Mar 10 15:17:19 crc kubenswrapper[4795]: I0310 15:17:19.185887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"8436cacf1b5dbb1ecf156aa5041298791f50d7318b7f087048ec77945fbd7697"} Mar 10 15:17:19 crc kubenswrapper[4795]: I0310 15:17:19.186782 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"763f5a84ea79ca41e852b7c467dce5e81209d9a31a115f8701cd8dd045d46d0f"} Mar 10 15:17:19 crc kubenswrapper[4795]: I0310 15:17:19.186839 4795 scope.go:117] "RemoveContainer" containerID="5ce3669fd9d20dd9ef4d0ed6efd59f873656df37d63a73174e09871244bd7b97" Mar 10 15:17:31 crc kubenswrapper[4795]: I0310 15:17:31.198328 4795 scope.go:117] "RemoveContainer" containerID="1573f4494b45acf52f25b49054853463a851d43ab58114956372b485cc5c25a5" Mar 10 15:17:31 crc kubenswrapper[4795]: I0310 15:17:31.251591 4795 scope.go:117] "RemoveContainer" containerID="4a08a8498382e5fe80791676789605377d90f2720bba1f48f27e953514654dc4" Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.163559 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552598-4zw7j"] Mar 10 15:18:00 crc kubenswrapper[4795]: E0310 15:18:00.164973 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb92169a-5109-4bf5-85e4-313837d438d4" containerName="oc" Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.165001 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb92169a-5109-4bf5-85e4-313837d438d4" 
containerName="oc" Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.165383 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb92169a-5109-4bf5-85e4-313837d438d4" containerName="oc" Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.166363 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-4zw7j" Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.169433 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552598-4zw7j"] Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.171562 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.171700 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.171708 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.308810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8cph\" (UniqueName: \"kubernetes.io/projected/1cf4236b-2e67-46b2-9d0b-f38d75ed5213-kube-api-access-n8cph\") pod \"auto-csr-approver-29552598-4zw7j\" (UID: \"1cf4236b-2e67-46b2-9d0b-f38d75ed5213\") " pod="openshift-infra/auto-csr-approver-29552598-4zw7j" Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.409710 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8cph\" (UniqueName: \"kubernetes.io/projected/1cf4236b-2e67-46b2-9d0b-f38d75ed5213-kube-api-access-n8cph\") pod \"auto-csr-approver-29552598-4zw7j\" (UID: \"1cf4236b-2e67-46b2-9d0b-f38d75ed5213\") " pod="openshift-infra/auto-csr-approver-29552598-4zw7j" Mar 10 15:18:00 crc 
kubenswrapper[4795]: I0310 15:18:00.442842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8cph\" (UniqueName: \"kubernetes.io/projected/1cf4236b-2e67-46b2-9d0b-f38d75ed5213-kube-api-access-n8cph\") pod \"auto-csr-approver-29552598-4zw7j\" (UID: \"1cf4236b-2e67-46b2-9d0b-f38d75ed5213\") " pod="openshift-infra/auto-csr-approver-29552598-4zw7j" Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.495217 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-4zw7j" Mar 10 15:18:00 crc kubenswrapper[4795]: I0310 15:18:00.704129 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552598-4zw7j"] Mar 10 15:18:01 crc kubenswrapper[4795]: I0310 15:18:01.488942 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552598-4zw7j" event={"ID":"1cf4236b-2e67-46b2-9d0b-f38d75ed5213","Type":"ContainerStarted","Data":"6b539828a3ce2ef26b934afc9e6c31d8281411dd1000e8dd3e30c5d9d870f349"} Mar 10 15:18:02 crc kubenswrapper[4795]: I0310 15:18:02.497481 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552598-4zw7j" event={"ID":"1cf4236b-2e67-46b2-9d0b-f38d75ed5213","Type":"ContainerStarted","Data":"c51f599633bfc406ad50a24970472587432a834637b95b39b22fadb24e99db5e"} Mar 10 15:18:02 crc kubenswrapper[4795]: I0310 15:18:02.520045 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552598-4zw7j" podStartSLOduration=1.29361128 podStartE2EDuration="2.520027133s" podCreationTimestamp="2026-03-10 15:18:00 +0000 UTC" firstStartedPulling="2026-03-10 15:18:00.716533296 +0000 UTC m=+713.882274194" lastFinishedPulling="2026-03-10 15:18:01.942949109 +0000 UTC m=+715.108690047" observedRunningTime="2026-03-10 15:18:02.515632908 +0000 UTC m=+715.681373846" watchObservedRunningTime="2026-03-10 
15:18:02.520027133 +0000 UTC m=+715.685768041" Mar 10 15:18:03 crc kubenswrapper[4795]: I0310 15:18:03.507327 4795 generic.go:334] "Generic (PLEG): container finished" podID="1cf4236b-2e67-46b2-9d0b-f38d75ed5213" containerID="c51f599633bfc406ad50a24970472587432a834637b95b39b22fadb24e99db5e" exitCode=0 Mar 10 15:18:03 crc kubenswrapper[4795]: I0310 15:18:03.507378 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552598-4zw7j" event={"ID":"1cf4236b-2e67-46b2-9d0b-f38d75ed5213","Type":"ContainerDied","Data":"c51f599633bfc406ad50a24970472587432a834637b95b39b22fadb24e99db5e"} Mar 10 15:18:04 crc kubenswrapper[4795]: I0310 15:18:04.917143 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-4zw7j" Mar 10 15:18:05 crc kubenswrapper[4795]: I0310 15:18:05.071820 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8cph\" (UniqueName: \"kubernetes.io/projected/1cf4236b-2e67-46b2-9d0b-f38d75ed5213-kube-api-access-n8cph\") pod \"1cf4236b-2e67-46b2-9d0b-f38d75ed5213\" (UID: \"1cf4236b-2e67-46b2-9d0b-f38d75ed5213\") " Mar 10 15:18:05 crc kubenswrapper[4795]: I0310 15:18:05.079788 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cf4236b-2e67-46b2-9d0b-f38d75ed5213-kube-api-access-n8cph" (OuterVolumeSpecName: "kube-api-access-n8cph") pod "1cf4236b-2e67-46b2-9d0b-f38d75ed5213" (UID: "1cf4236b-2e67-46b2-9d0b-f38d75ed5213"). InnerVolumeSpecName "kube-api-access-n8cph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:18:05 crc kubenswrapper[4795]: I0310 15:18:05.175549 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8cph\" (UniqueName: \"kubernetes.io/projected/1cf4236b-2e67-46b2-9d0b-f38d75ed5213-kube-api-access-n8cph\") on node \"crc\" DevicePath \"\"" Mar 10 15:18:05 crc kubenswrapper[4795]: I0310 15:18:05.535859 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552598-4zw7j" event={"ID":"1cf4236b-2e67-46b2-9d0b-f38d75ed5213","Type":"ContainerDied","Data":"6b539828a3ce2ef26b934afc9e6c31d8281411dd1000e8dd3e30c5d9d870f349"} Mar 10 15:18:05 crc kubenswrapper[4795]: I0310 15:18:05.535920 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b539828a3ce2ef26b934afc9e6c31d8281411dd1000e8dd3e30c5d9d870f349" Mar 10 15:18:05 crc kubenswrapper[4795]: I0310 15:18:05.535994 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552598-4zw7j" Mar 10 15:18:05 crc kubenswrapper[4795]: I0310 15:18:05.591982 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-zf6h7"] Mar 10 15:18:05 crc kubenswrapper[4795]: I0310 15:18:05.598989 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552592-zf6h7"] Mar 10 15:18:05 crc kubenswrapper[4795]: E0310 15:18:05.657639 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cf4236b_2e67_46b2_9d0b_f38d75ed5213.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cf4236b_2e67_46b2_9d0b_f38d75ed5213.slice/crio-6b539828a3ce2ef26b934afc9e6c31d8281411dd1000e8dd3e30c5d9d870f349\": RecentStats: unable to find data in memory cache]" 
Mar 10 15:18:07 crc kubenswrapper[4795]: I0310 15:18:07.488886 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c293ad9-88b5-420b-b70f-6122d9a29d5b" path="/var/lib/kubelet/pods/0c293ad9-88b5-420b-b70f-6122d9a29d5b/volumes" Mar 10 15:18:27 crc kubenswrapper[4795]: E0310 15:18:27.099957 4795 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.624s" Mar 10 15:18:31 crc kubenswrapper[4795]: I0310 15:18:31.343293 4795 scope.go:117] "RemoveContainer" containerID="eac3adcd0deb9e0f6270e8f2ec94f4e0e58a7dbc91f979986150d1423d88175a" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.391579 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6jbwd"] Mar 10 15:19:12 crc kubenswrapper[4795]: E0310 15:19:12.392389 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf4236b-2e67-46b2-9d0b-f38d75ed5213" containerName="oc" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.392406 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf4236b-2e67-46b2-9d0b-f38d75ed5213" containerName="oc" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.392523 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cf4236b-2e67-46b2-9d0b-f38d75ed5213" containerName="oc" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.392960 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6jbwd" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.395940 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.400456 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-pqfzr" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.400643 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.411014 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6jbwd"] Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.419101 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-c987b"] Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.420212 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8wqn\" (UniqueName: \"kubernetes.io/projected/c1e372af-e82c-4c9d-b29c-7428b5d7746f-kube-api-access-k8wqn\") pod \"cert-manager-cainjector-cf98fcc89-6jbwd\" (UID: \"c1e372af-e82c-4c9d-b29c-7428b5d7746f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6jbwd" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.423623 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-c987b" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.430632 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-dpkrk" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.434421 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-c987b"] Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.445983 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gtsvs"] Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.447670 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-gtsvs" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.449620 4795 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-bkq9q" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.453633 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gtsvs"] Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.521427 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8wqn\" (UniqueName: \"kubernetes.io/projected/c1e372af-e82c-4c9d-b29c-7428b5d7746f-kube-api-access-k8wqn\") pod \"cert-manager-cainjector-cf98fcc89-6jbwd\" (UID: \"c1e372af-e82c-4c9d-b29c-7428b5d7746f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6jbwd" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.542632 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8wqn\" (UniqueName: \"kubernetes.io/projected/c1e372af-e82c-4c9d-b29c-7428b5d7746f-kube-api-access-k8wqn\") pod \"cert-manager-cainjector-cf98fcc89-6jbwd\" (UID: \"c1e372af-e82c-4c9d-b29c-7428b5d7746f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-6jbwd" Mar 
10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.623003 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ltph\" (UniqueName: \"kubernetes.io/projected/cedffdc5-be80-4b91-836a-261b0388fabd-kube-api-access-2ltph\") pod \"cert-manager-858654f9db-c987b\" (UID: \"cedffdc5-be80-4b91-836a-261b0388fabd\") " pod="cert-manager/cert-manager-858654f9db-c987b" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.623398 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ffl7\" (UniqueName: \"kubernetes.io/projected/f9920cd8-80d1-4f31-ad9d-fbc4a4b01f59-kube-api-access-8ffl7\") pod \"cert-manager-webhook-687f57d79b-gtsvs\" (UID: \"f9920cd8-80d1-4f31-ad9d-fbc4a4b01f59\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gtsvs" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.725002 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6jbwd" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.725796 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ltph\" (UniqueName: \"kubernetes.io/projected/cedffdc5-be80-4b91-836a-261b0388fabd-kube-api-access-2ltph\") pod \"cert-manager-858654f9db-c987b\" (UID: \"cedffdc5-be80-4b91-836a-261b0388fabd\") " pod="cert-manager/cert-manager-858654f9db-c987b" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.725876 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ffl7\" (UniqueName: \"kubernetes.io/projected/f9920cd8-80d1-4f31-ad9d-fbc4a4b01f59-kube-api-access-8ffl7\") pod \"cert-manager-webhook-687f57d79b-gtsvs\" (UID: \"f9920cd8-80d1-4f31-ad9d-fbc4a4b01f59\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gtsvs" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.742829 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2ltph\" (UniqueName: \"kubernetes.io/projected/cedffdc5-be80-4b91-836a-261b0388fabd-kube-api-access-2ltph\") pod \"cert-manager-858654f9db-c987b\" (UID: \"cedffdc5-be80-4b91-836a-261b0388fabd\") " pod="cert-manager/cert-manager-858654f9db-c987b" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.748999 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-c987b" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.755523 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ffl7\" (UniqueName: \"kubernetes.io/projected/f9920cd8-80d1-4f31-ad9d-fbc4a4b01f59-kube-api-access-8ffl7\") pod \"cert-manager-webhook-687f57d79b-gtsvs\" (UID: \"f9920cd8-80d1-4f31-ad9d-fbc4a4b01f59\") " pod="cert-manager/cert-manager-webhook-687f57d79b-gtsvs" Mar 10 15:19:12 crc kubenswrapper[4795]: I0310 15:19:12.763295 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-gtsvs" Mar 10 15:19:13 crc kubenswrapper[4795]: I0310 15:19:13.056120 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-gtsvs"] Mar 10 15:19:13 crc kubenswrapper[4795]: W0310 15:19:13.059273 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9920cd8_80d1_4f31_ad9d_fbc4a4b01f59.slice/crio-611f6d75b01d2f7543c64ef98cf8b9c4e3949c756fe7fe955b96b756dbfd7e99 WatchSource:0}: Error finding container 611f6d75b01d2f7543c64ef98cf8b9c4e3949c756fe7fe955b96b756dbfd7e99: Status 404 returned error can't find the container with id 611f6d75b01d2f7543c64ef98cf8b9c4e3949c756fe7fe955b96b756dbfd7e99 Mar 10 15:19:13 crc kubenswrapper[4795]: I0310 15:19:13.203521 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-c987b"] Mar 10 15:19:13 crc kubenswrapper[4795]: W0310 15:19:13.204718 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcedffdc5_be80_4b91_836a_261b0388fabd.slice/crio-f2a1631b43afdcc42f595b62c86840689ff38869bd32f12138c1ac7053e819e0 WatchSource:0}: Error finding container f2a1631b43afdcc42f595b62c86840689ff38869bd32f12138c1ac7053e819e0: Status 404 returned error can't find the container with id f2a1631b43afdcc42f595b62c86840689ff38869bd32f12138c1ac7053e819e0 Mar 10 15:19:13 crc kubenswrapper[4795]: I0310 15:19:13.219128 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-6jbwd"] Mar 10 15:19:13 crc kubenswrapper[4795]: W0310 15:19:13.220948 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1e372af_e82c_4c9d_b29c_7428b5d7746f.slice/crio-873fc30cd1c62296fbe0f9d19ebf1630b6eb803d113d5e452f4e3ea2460b7523 
WatchSource:0}: Error finding container 873fc30cd1c62296fbe0f9d19ebf1630b6eb803d113d5e452f4e3ea2460b7523: Status 404 returned error can't find the container with id 873fc30cd1c62296fbe0f9d19ebf1630b6eb803d113d5e452f4e3ea2460b7523 Mar 10 15:19:13 crc kubenswrapper[4795]: I0310 15:19:13.486448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-c987b" event={"ID":"cedffdc5-be80-4b91-836a-261b0388fabd","Type":"ContainerStarted","Data":"f2a1631b43afdcc42f595b62c86840689ff38869bd32f12138c1ac7053e819e0"} Mar 10 15:19:13 crc kubenswrapper[4795]: I0310 15:19:13.486486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-gtsvs" event={"ID":"f9920cd8-80d1-4f31-ad9d-fbc4a4b01f59","Type":"ContainerStarted","Data":"611f6d75b01d2f7543c64ef98cf8b9c4e3949c756fe7fe955b96b756dbfd7e99"} Mar 10 15:19:13 crc kubenswrapper[4795]: I0310 15:19:13.486501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6jbwd" event={"ID":"c1e372af-e82c-4c9d-b29c-7428b5d7746f","Type":"ContainerStarted","Data":"873fc30cd1c62296fbe0f9d19ebf1630b6eb803d113d5e452f4e3ea2460b7523"} Mar 10 15:19:16 crc kubenswrapper[4795]: I0310 15:19:16.498598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-gtsvs" event={"ID":"f9920cd8-80d1-4f31-ad9d-fbc4a4b01f59","Type":"ContainerStarted","Data":"a23ef336cb150b98fabe15e7800a037f5985067a391eaed3ebebb9a90e46b8a3"} Mar 10 15:19:16 crc kubenswrapper[4795]: I0310 15:19:16.498899 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-gtsvs" Mar 10 15:19:16 crc kubenswrapper[4795]: I0310 15:19:16.519261 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-gtsvs" podStartSLOduration=1.754982845 podStartE2EDuration="4.519245808s" 
podCreationTimestamp="2026-03-10 15:19:12 +0000 UTC" firstStartedPulling="2026-03-10 15:19:13.060961058 +0000 UTC m=+786.226701956" lastFinishedPulling="2026-03-10 15:19:15.825224001 +0000 UTC m=+788.990964919" observedRunningTime="2026-03-10 15:19:16.51688407 +0000 UTC m=+789.682624968" watchObservedRunningTime="2026-03-10 15:19:16.519245808 +0000 UTC m=+789.684986706" Mar 10 15:19:17 crc kubenswrapper[4795]: I0310 15:19:17.502786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6jbwd" event={"ID":"c1e372af-e82c-4c9d-b29c-7428b5d7746f","Type":"ContainerStarted","Data":"ae95150da85984563ff1c0813148f4d3754bd7d1d393416594c9649c2bce2c9b"} Mar 10 15:19:17 crc kubenswrapper[4795]: I0310 15:19:17.505270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-c987b" event={"ID":"cedffdc5-be80-4b91-836a-261b0388fabd","Type":"ContainerStarted","Data":"867d8610617d4428f6b78fa542e2500a37fd2810dd15fafc4d3ac2bfc8507991"} Mar 10 15:19:17 crc kubenswrapper[4795]: I0310 15:19:17.550168 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-6jbwd" podStartSLOduration=1.778933278 podStartE2EDuration="5.550151635s" podCreationTimestamp="2026-03-10 15:19:12 +0000 UTC" firstStartedPulling="2026-03-10 15:19:13.224006699 +0000 UTC m=+786.389747597" lastFinishedPulling="2026-03-10 15:19:16.995225016 +0000 UTC m=+790.160965954" observedRunningTime="2026-03-10 15:19:17.532608285 +0000 UTC m=+790.698349203" watchObservedRunningTime="2026-03-10 15:19:17.550151635 +0000 UTC m=+790.715892533" Mar 10 15:19:17 crc kubenswrapper[4795]: I0310 15:19:17.552684 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-c987b" podStartSLOduration=1.666197383 podStartE2EDuration="5.552676967s" podCreationTimestamp="2026-03-10 15:19:12 +0000 UTC" firstStartedPulling="2026-03-10 
15:19:13.206684465 +0000 UTC m=+786.372425363" lastFinishedPulling="2026-03-10 15:19:17.093164009 +0000 UTC m=+790.258904947" observedRunningTime="2026-03-10 15:19:17.549678352 +0000 UTC m=+790.715419250" watchObservedRunningTime="2026-03-10 15:19:17.552676967 +0000 UTC m=+790.718417865" Mar 10 15:19:18 crc kubenswrapper[4795]: I0310 15:19:18.539642 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:19:18 crc kubenswrapper[4795]: I0310 15:19:18.539733 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.368904 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4q8gk"] Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.369640 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovn-controller" containerID="cri-o://c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821" gracePeriod=30 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.370110 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovn-acl-logging" containerID="cri-o://8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f" gracePeriod=30 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 
15:19:22.370192 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="northd" containerID="cri-o://db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced" gracePeriod=30 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.370224 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="sbdb" containerID="cri-o://2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc" gracePeriod=30 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.370249 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="kube-rbac-proxy-node" containerID="cri-o://312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2" gracePeriod=30 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.370341 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10" gracePeriod=30 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.370302 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="nbdb" containerID="cri-o://ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099" gracePeriod=30 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.404722 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" 
podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" containerID="cri-o://61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b" gracePeriod=30 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.549819 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/3.log" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.550675 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/1.log" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.552985 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/0.log" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.553439 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-controller/0.log" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.553794 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f" exitCode=143 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.553818 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10" exitCode=0 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.553825 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2" exitCode=0 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.553825 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f"} Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.553834 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821" exitCode=143 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.553876 4795 scope.go:117] "RemoveContainer" containerID="05a8c58ac41af9cce0f3f53a2adbbdc7fafec194eada68fd31e6833c4613ae65" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.553863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10"} Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.553949 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2"} Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.553968 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821"} Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.555900 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v49r8_589b366f-9132-43cc-8d7a-d401d396bf06/kube-multus/2.log" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.556387 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-v49r8_589b366f-9132-43cc-8d7a-d401d396bf06/kube-multus/1.log" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.556428 4795 generic.go:334] "Generic (PLEG): container finished" podID="589b366f-9132-43cc-8d7a-d401d396bf06" containerID="a32e09e6dbae777fd5c7993935247e382578d35bdb2f96fa0cb4593dbf63686a" exitCode=2 Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.556450 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v49r8" event={"ID":"589b366f-9132-43cc-8d7a-d401d396bf06","Type":"ContainerDied","Data":"a32e09e6dbae777fd5c7993935247e382578d35bdb2f96fa0cb4593dbf63686a"} Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.557551 4795 scope.go:117] "RemoveContainer" containerID="a32e09e6dbae777fd5c7993935247e382578d35bdb2f96fa0cb4593dbf63686a" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.557743 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-v49r8_openshift-multus(589b366f-9132-43cc-8d7a-d401d396bf06)\"" pod="openshift-multus/multus-v49r8" podUID="589b366f-9132-43cc-8d7a-d401d396bf06" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.686238 4795 scope.go:117] "RemoveContainer" containerID="e0f21583c0bcf624ec21be8b5e644f1117dc8f42332c8e339f5b2e9a8dba08f6" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.711534 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/3.log" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.712139 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/1.log" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.717153 4795 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-controller/0.log" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.718708 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.767901 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-gtsvs" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.792553 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qxn8b"] Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.792878 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovn-acl-logging" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.792900 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovn-acl-logging" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.792921 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="sbdb" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.792930 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="sbdb" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.792944 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.792954 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.792964 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.792973 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.792982 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="northd" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.792989 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="northd" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.793001 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="nbdb" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793008 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="nbdb" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.793021 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="kubecfg-setup" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793028 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="kubecfg-setup" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.793038 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793047 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.793055 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793063 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.793091 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovn-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793104 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovn-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.793119 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="kube-rbac-proxy-node" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793128 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="kube-rbac-proxy-node" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.793142 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793153 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793273 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovn-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793288 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovn-acl-logging" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793299 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="northd" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793310 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793319 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="kube-rbac-proxy-ovn-metrics" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793330 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="kube-rbac-proxy-node" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793340 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793350 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793362 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="sbdb" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793372 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="nbdb" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793382 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.793499 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovn-acl-logging" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793510 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovn-acl-logging" Mar 10 15:19:22 crc kubenswrapper[4795]: E0310 15:19:22.793534 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793542 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793676 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovn-acl-logging" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.793695 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerName="ovnkube-controller" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.795333 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861600 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-systemd-units\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861642 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-netns\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861666 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-ovn-kubernetes\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861692 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-openvswitch\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861721 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-node-log\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861736 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861827 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-systemd\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861797 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-node-log" (OuterVolumeSpecName: "node-log") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861805 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861886 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861909 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-var-lib-openvswitch\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861941 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861976 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-bin\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862007 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-script-lib\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862033 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-ovn\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862055 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-var-lib-cni-networks-ovn-kubernetes\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.861978 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862124 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r92zs\" (UniqueName: \"kubernetes.io/projected/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-kube-api-access-r92zs\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862105 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862152 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovn-node-metrics-cert\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862162 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862196 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-log-socket\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862273 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-log-socket" (OuterVolumeSpecName: "log-socket") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862432 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862671 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862222 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-env-overrides\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862971 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-etc-openvswitch\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.862990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-config\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863090 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863009 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-kubelet\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-netd\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863360 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-slash\") pod \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\" (UID: \"89b0616b-9d8b-43a5-b8c7-d9cbb4669583\") " Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863381 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863405 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863499 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-slash" (OuterVolumeSpecName: "host-slash") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-node-log\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863686 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8876699d-9c7b-48bc-86e3-562e27e61540-ovnkube-script-lib\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863714 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-var-lib-openvswitch\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863743 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8876699d-9c7b-48bc-86e3-562e27e61540-env-overrides\") pod 
\"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863759 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-slash\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863781 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-kubelet\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863801 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863822 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-log-socket\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863867 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-cni-bin\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.863901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-run-netns\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864150 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8876699d-9c7b-48bc-86e3-562e27e61540-ovn-node-metrics-cert\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864185 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8876699d-9c7b-48bc-86e3-562e27e61540-ovnkube-config\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864205 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-run-ovn\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-run-ovn-kubernetes\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864256 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-etc-openvswitch\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864291 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-run-systemd\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864307 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-cni-netd\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864343 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncb5x\" (UniqueName: \"kubernetes.io/projected/8876699d-9c7b-48bc-86e3-562e27e61540-kube-api-access-ncb5x\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864382 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-run-openvswitch\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-systemd-units\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864490 4795 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864504 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864517 4795 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864527 4795 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864538 4795 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-node-log\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864546 4795 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864555 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864563 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864572 4795 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864581 4795 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864590 4795 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-log-socket\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864600 4795 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-env-overrides\") on 
node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864609 4795 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864618 4795 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864626 4795 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864636 4795 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.864644 4795 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-host-slash\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.866649 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-kube-api-access-r92zs" (OuterVolumeSpecName: "kube-api-access-r92zs") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "kube-api-access-r92zs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.867187 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.873739 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "89b0616b-9d8b-43a5-b8c7-d9cbb4669583" (UID: "89b0616b-9d8b-43a5-b8c7-d9cbb4669583"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965659 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-run-netns\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965757 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8876699d-9c7b-48bc-86e3-562e27e61540-ovn-node-metrics-cert\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965782 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8876699d-9c7b-48bc-86e3-562e27e61540-ovnkube-config\") pod 
\"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965788 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-run-netns\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965807 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-run-ovn\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965859 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-run-ovn\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965880 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-run-ovn-kubernetes\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965904 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-run-ovn-kubernetes\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965917 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-etc-openvswitch\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965943 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-run-systemd\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-cni-netd\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.965988 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncb5x\" (UniqueName: \"kubernetes.io/projected/8876699d-9c7b-48bc-86e3-562e27e61540-kube-api-access-ncb5x\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966012 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-run-openvswitch\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc 
kubenswrapper[4795]: I0310 15:19:22.966035 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-systemd-units\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966062 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-node-log\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966106 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8876699d-9c7b-48bc-86e3-562e27e61540-ovnkube-script-lib\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-var-lib-openvswitch\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-slash\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966171 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8876699d-9c7b-48bc-86e3-562e27e61540-env-overrides\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-kubelet\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966215 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-log-socket\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966289 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-cni-bin\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966333 4795 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966349 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r92zs\" (UniqueName: \"kubernetes.io/projected/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-kube-api-access-r92zs\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966363 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/89b0616b-9d8b-43a5-b8c7-d9cbb4669583-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966366 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-node-log\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.966397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-cni-bin\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.967312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8876699d-9c7b-48bc-86e3-562e27e61540-ovnkube-config\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.967387 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-etc-openvswitch\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.967435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-run-systemd\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.967478 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-cni-netd\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.967491 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8876699d-9c7b-48bc-86e3-562e27e61540-ovnkube-script-lib\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.967558 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-var-lib-openvswitch\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.967602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-slash\") pod \"ovnkube-node-qxn8b\" (UID: 
\"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.967853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-run-openvswitch\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.967903 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-systemd-units\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.967949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.968004 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-host-kubelet\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.968046 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8876699d-9c7b-48bc-86e3-562e27e61540-log-socket\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.968119 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8876699d-9c7b-48bc-86e3-562e27e61540-env-overrides\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:22 crc kubenswrapper[4795]: I0310 15:19:22.969793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8876699d-9c7b-48bc-86e3-562e27e61540-ovn-node-metrics-cert\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.003683 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncb5x\" (UniqueName: \"kubernetes.io/projected/8876699d-9c7b-48bc-86e3-562e27e61540-kube-api-access-ncb5x\") pod \"ovnkube-node-qxn8b\" (UID: \"8876699d-9c7b-48bc-86e3-562e27e61540\") " pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.116874 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.565669 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v49r8_589b366f-9132-43cc-8d7a-d401d396bf06/kube-multus/2.log" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.568060 4795 generic.go:334] "Generic (PLEG): container finished" podID="8876699d-9c7b-48bc-86e3-562e27e61540" containerID="d9fa4a284d2c1c86bc20b0f51c441277a9ca731351356bb7c881b0506f5ac6f9" exitCode=0 Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.568121 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" event={"ID":"8876699d-9c7b-48bc-86e3-562e27e61540","Type":"ContainerDied","Data":"d9fa4a284d2c1c86bc20b0f51c441277a9ca731351356bb7c881b0506f5ac6f9"} Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.568175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" event={"ID":"8876699d-9c7b-48bc-86e3-562e27e61540","Type":"ContainerStarted","Data":"d31b63182a9459c4e5fb2802f0ad145027513a85ebbbb9cf876e78af6e147240"} Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.572693 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovnkube-controller/3.log" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.573520 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-acl-logging/1.log" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.577096 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4q8gk_89b0616b-9d8b-43a5-b8c7-d9cbb4669583/ovn-controller/0.log" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.577801 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b" exitCode=0 Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.577843 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc" exitCode=0 Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.577861 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099" exitCode=0 Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.577876 4795 generic.go:334] "Generic (PLEG): container finished" podID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" containerID="db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced" exitCode=0 Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.577902 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b"} Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.577940 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.577967 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc"} Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.577984 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099"} Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.577997 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced"} Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.578010 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4q8gk" event={"ID":"89b0616b-9d8b-43a5-b8c7-d9cbb4669583","Type":"ContainerDied","Data":"190bf2554acd11f746ecab58421da3e6bccd92617f9c2a94668b4caf99de24e0"} Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.578031 4795 scope.go:117] "RemoveContainer" containerID="61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.617369 4795 scope.go:117] "RemoveContainer" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.650121 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4q8gk"] Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.655667 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-4q8gk"] Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.687509 4795 scope.go:117] "RemoveContainer" containerID="8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.717718 4795 scope.go:117] "RemoveContainer" containerID="2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.735239 4795 scope.go:117] "RemoveContainer" containerID="ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.754459 4795 scope.go:117] "RemoveContainer" containerID="db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.775375 4795 scope.go:117] "RemoveContainer" containerID="62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.797617 4795 scope.go:117] "RemoveContainer" containerID="312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.827123 4795 scope.go:117] "RemoveContainer" containerID="c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.857046 4795 scope.go:117] "RemoveContainer" containerID="2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.892962 4795 scope.go:117] "RemoveContainer" containerID="61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b" Mar 10 15:19:23 crc kubenswrapper[4795]: E0310 15:19:23.893550 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b\": container with ID starting with 
61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b not found: ID does not exist" containerID="61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.893589 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b"} err="failed to get container status \"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b\": rpc error: code = NotFound desc = could not find container \"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b\": container with ID starting with 61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.893618 4795 scope.go:117] "RemoveContainer" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08" Mar 10 15:19:23 crc kubenswrapper[4795]: E0310 15:19:23.894278 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\": container with ID starting with 3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08 not found: ID does not exist" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.894345 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08"} err="failed to get container status \"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\": rpc error: code = NotFound desc = could not find container \"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\": container with ID starting with 3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08 not found: ID does not 
exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.894403 4795 scope.go:117] "RemoveContainer" containerID="8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f" Mar 10 15:19:23 crc kubenswrapper[4795]: E0310 15:19:23.894931 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\": container with ID starting with 8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f not found: ID does not exist" containerID="8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.894984 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f"} err="failed to get container status \"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\": rpc error: code = NotFound desc = could not find container \"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\": container with ID starting with 8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.895015 4795 scope.go:117] "RemoveContainer" containerID="2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc" Mar 10 15:19:23 crc kubenswrapper[4795]: E0310 15:19:23.895476 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\": container with ID starting with 2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc not found: ID does not exist" containerID="2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.895543 4795 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc"} err="failed to get container status \"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\": rpc error: code = NotFound desc = could not find container \"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\": container with ID starting with 2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.895577 4795 scope.go:117] "RemoveContainer" containerID="ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099" Mar 10 15:19:23 crc kubenswrapper[4795]: E0310 15:19:23.896134 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\": container with ID starting with ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099 not found: ID does not exist" containerID="ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.896181 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099"} err="failed to get container status \"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\": rpc error: code = NotFound desc = could not find container \"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\": container with ID starting with ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099 not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.896210 4795 scope.go:117] "RemoveContainer" containerID="db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced" Mar 10 15:19:23 crc kubenswrapper[4795]: E0310 15:19:23.896714 4795 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\": container with ID starting with db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced not found: ID does not exist" containerID="db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.897054 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced"} err="failed to get container status \"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\": rpc error: code = NotFound desc = could not find container \"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\": container with ID starting with db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.897158 4795 scope.go:117] "RemoveContainer" containerID="62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10" Mar 10 15:19:23 crc kubenswrapper[4795]: E0310 15:19:23.897632 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\": container with ID starting with 62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10 not found: ID does not exist" containerID="62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.897699 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10"} err="failed to get container status \"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\": rpc error: code = NotFound desc = could 
not find container \"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\": container with ID starting with 62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10 not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.897730 4795 scope.go:117] "RemoveContainer" containerID="312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2" Mar 10 15:19:23 crc kubenswrapper[4795]: E0310 15:19:23.898207 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\": container with ID starting with 312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2 not found: ID does not exist" containerID="312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.898255 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2"} err="failed to get container status \"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\": rpc error: code = NotFound desc = could not find container \"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\": container with ID starting with 312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2 not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.898288 4795 scope.go:117] "RemoveContainer" containerID="c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821" Mar 10 15:19:23 crc kubenswrapper[4795]: E0310 15:19:23.900526 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\": container with ID starting with c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821 not found: 
ID does not exist" containerID="c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.900576 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821"} err="failed to get container status \"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\": rpc error: code = NotFound desc = could not find container \"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\": container with ID starting with c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821 not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.900613 4795 scope.go:117] "RemoveContainer" containerID="2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9" Mar 10 15:19:23 crc kubenswrapper[4795]: E0310 15:19:23.901113 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\": container with ID starting with 2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9 not found: ID does not exist" containerID="2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.901159 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9"} err="failed to get container status \"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\": rpc error: code = NotFound desc = could not find container \"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\": container with ID starting with 2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9 not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.901187 4795 
scope.go:117] "RemoveContainer" containerID="61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.901751 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b"} err="failed to get container status \"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b\": rpc error: code = NotFound desc = could not find container \"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b\": container with ID starting with 61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.901809 4795 scope.go:117] "RemoveContainer" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.902274 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08"} err="failed to get container status \"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\": rpc error: code = NotFound desc = could not find container \"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\": container with ID starting with 3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08 not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.902315 4795 scope.go:117] "RemoveContainer" containerID="8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.902693 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f"} err="failed to get container status \"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\": rpc 
error: code = NotFound desc = could not find container \"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\": container with ID starting with 8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.902731 4795 scope.go:117] "RemoveContainer" containerID="2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.903137 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc"} err="failed to get container status \"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\": rpc error: code = NotFound desc = could not find container \"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\": container with ID starting with 2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.903187 4795 scope.go:117] "RemoveContainer" containerID="ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.903551 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099"} err="failed to get container status \"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\": rpc error: code = NotFound desc = could not find container \"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\": container with ID starting with ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099 not found: ID does not exist" Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.903586 4795 scope.go:117] "RemoveContainer" containerID="db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced" Mar 10 15:19:23 crc 
kubenswrapper[4795]: I0310 15:19:23.903950 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced"} err="failed to get container status \"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\": rpc error: code = NotFound desc = could not find container \"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\": container with ID starting with db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.903981 4795 scope.go:117] "RemoveContainer" containerID="62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.904366 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10"} err="failed to get container status \"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\": rpc error: code = NotFound desc = could not find container \"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\": container with ID starting with 62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.904393 4795 scope.go:117] "RemoveContainer" containerID="312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.904804 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2"} err="failed to get container status \"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\": rpc error: code = NotFound desc = could not find container \"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\": container with ID starting with 312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.904852 4795 scope.go:117] "RemoveContainer" containerID="c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.905333 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821"} err="failed to get container status \"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\": rpc error: code = NotFound desc = could not find container \"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\": container with ID starting with c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.905372 4795 scope.go:117] "RemoveContainer" containerID="2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.905738 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9"} err="failed to get container status \"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\": rpc error: code = NotFound desc = could not find container \"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\": container with ID starting with 2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.905776 4795 scope.go:117] "RemoveContainer" containerID="61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.906168 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b"} err="failed to get container status \"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b\": rpc error: code = NotFound desc = could not find container \"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b\": container with ID starting with 61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.906203 4795 scope.go:117] "RemoveContainer" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.906662 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08"} err="failed to get container status \"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\": rpc error: code = NotFound desc = could not find container \"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\": container with ID starting with 3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.906706 4795 scope.go:117] "RemoveContainer" containerID="8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.907105 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f"} err="failed to get container status \"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\": rpc error: code = NotFound desc = could not find container \"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\": container with ID starting with 8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.907148 4795 scope.go:117] "RemoveContainer" containerID="2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.907519 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc"} err="failed to get container status \"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\": rpc error: code = NotFound desc = could not find container \"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\": container with ID starting with 2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.907559 4795 scope.go:117] "RemoveContainer" containerID="ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.907940 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099"} err="failed to get container status \"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\": rpc error: code = NotFound desc = could not find container \"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\": container with ID starting with ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.907982 4795 scope.go:117] "RemoveContainer" containerID="db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.908550 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced"} err="failed to get container status \"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\": rpc error: code = NotFound desc = could not find container \"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\": container with ID starting with db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.908592 4795 scope.go:117] "RemoveContainer" containerID="62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.909001 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10"} err="failed to get container status \"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\": rpc error: code = NotFound desc = could not find container \"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\": container with ID starting with 62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.909042 4795 scope.go:117] "RemoveContainer" containerID="312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.909591 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2"} err="failed to get container status \"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\": rpc error: code = NotFound desc = could not find container \"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\": container with ID starting with 312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.909653 4795 scope.go:117] "RemoveContainer" containerID="c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.910139 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821"} err="failed to get container status \"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\": rpc error: code = NotFound desc = could not find container \"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\": container with ID starting with c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.910187 4795 scope.go:117] "RemoveContainer" containerID="2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.910564 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9"} err="failed to get container status \"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\": rpc error: code = NotFound desc = could not find container \"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\": container with ID starting with 2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.910608 4795 scope.go:117] "RemoveContainer" containerID="61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.911022 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b"} err="failed to get container status \"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b\": rpc error: code = NotFound desc = could not find container \"61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b\": container with ID starting with 61c8f9461b948b82b327b25e54b693040634369adbb4f32e8c62fab268babf5b not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.911057 4795 scope.go:117] "RemoveContainer" containerID="3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.911483 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08"} err="failed to get container status \"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\": rpc error: code = NotFound desc = could not find container \"3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08\": container with ID starting with 3e02bcd8dad2b27c95984cd41a477eec2011dda81ef9ac94812c476d1b890d08 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.911521 4795 scope.go:117] "RemoveContainer" containerID="8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.911895 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f"} err="failed to get container status \"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\": rpc error: code = NotFound desc = could not find container \"8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f\": container with ID starting with 8cb22ac3dc20964dc1d984c0c3138b86e73f0e2651c32ee7ef82f84f91f0504f not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.911933 4795 scope.go:117] "RemoveContainer" containerID="2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.912334 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc"} err="failed to get container status \"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\": rpc error: code = NotFound desc = could not find container \"2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc\": container with ID starting with 2767e9820805261d075f5207c6d64569abf14f293f4ce74d9f2182edc6c87adc not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.912376 4795 scope.go:117] "RemoveContainer" containerID="ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.912761 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099"} err="failed to get container status \"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\": rpc error: code = NotFound desc = could not find container \"ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099\": container with ID starting with ca83d52c313f13e704404c42fa46398684234df9468a3f8a1e87a68cd5db6099 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.912798 4795 scope.go:117] "RemoveContainer" containerID="db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.913183 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced"} err="failed to get container status \"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\": rpc error: code = NotFound desc = could not find container \"db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced\": container with ID starting with db070494c78e27f2885bc525f3f2887952569bbe6dbefb8c98628f0f849e3ced not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.913232 4795 scope.go:117] "RemoveContainer" containerID="62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.913587 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10"} err="failed to get container status \"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\": rpc error: code = NotFound desc = could not find container \"62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10\": container with ID starting with 62493f03586940dbb9f1cf3c9425f21859cdf768635d730944cc07f1abdaeb10 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.913623 4795 scope.go:117] "RemoveContainer" containerID="312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.913980 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2"} err="failed to get container status \"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\": rpc error: code = NotFound desc = could not find container \"312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2\": container with ID starting with 312f8bd912e4e16bde4a17f8402afef46a1619bd25f3335b73d784c12bfff9c2 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.914016 4795 scope.go:117] "RemoveContainer" containerID="c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.914412 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821"} err="failed to get container status \"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\": rpc error: code = NotFound desc = could not find container \"c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821\": container with ID starting with c787b51bf386394a5e0d867e8b9561e431e7663b51563b17b4695c7c48a43821 not found: ID does not exist"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.914445 4795 scope.go:117] "RemoveContainer" containerID="2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9"
Mar 10 15:19:23 crc kubenswrapper[4795]: I0310 15:19:23.914801 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9"} err="failed to get container status \"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\": rpc error: code = NotFound desc = could not find container \"2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9\": container with ID starting with 2cb54d111e73e3904d4f703ab024d2709a0bd17595cc957adf79055e297a2af9 not found: ID does not exist"
Mar 10 15:19:24 crc kubenswrapper[4795]: I0310 15:19:24.586433 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" event={"ID":"8876699d-9c7b-48bc-86e3-562e27e61540","Type":"ContainerStarted","Data":"2e5e30e16588cd88e067220690bf95c0d90cdc976724e1fe01e6d63f3de23f0f"}
Mar 10 15:19:24 crc kubenswrapper[4795]: I0310 15:19:24.586491 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" event={"ID":"8876699d-9c7b-48bc-86e3-562e27e61540","Type":"ContainerStarted","Data":"13988f9eea18518871a0cf4ecb9197a091c6d920dd90f50c9f4e7e924261c067"}
Mar 10 15:19:24 crc kubenswrapper[4795]: I0310 15:19:24.586511 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" event={"ID":"8876699d-9c7b-48bc-86e3-562e27e61540","Type":"ContainerStarted","Data":"7b5a6f598439df9d5ebd35663f6acb268f1196ac631f10d47bed63fa255b7ec1"}
Mar 10 15:19:24 crc kubenswrapper[4795]: I0310 15:19:24.586527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" event={"ID":"8876699d-9c7b-48bc-86e3-562e27e61540","Type":"ContainerStarted","Data":"96475cee2f4e3e2b44d5e20d051e3b82fd4973f0fca780b0fddad0d21df736ee"}
Mar 10 15:19:24 crc kubenswrapper[4795]: I0310 15:19:24.586543 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" event={"ID":"8876699d-9c7b-48bc-86e3-562e27e61540","Type":"ContainerStarted","Data":"5a5a92c714577ca180d0bcab4d5ffb0e124abab31813dc416f84a3e83f9192f9"}
Mar 10 15:19:24 crc kubenswrapper[4795]: I0310 15:19:24.586558 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" event={"ID":"8876699d-9c7b-48bc-86e3-562e27e61540","Type":"ContainerStarted","Data":"36f102d4cc414bc74f8012d3d4302367e5861ba3afb60648b0acf11944e27220"}
Mar 10 15:19:25 crc kubenswrapper[4795]: I0310 15:19:25.493775 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b0616b-9d8b-43a5-b8c7-d9cbb4669583" path="/var/lib/kubelet/pods/89b0616b-9d8b-43a5-b8c7-d9cbb4669583/volumes"
Mar 10 15:19:27 crc kubenswrapper[4795]: I0310 15:19:27.615497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" event={"ID":"8876699d-9c7b-48bc-86e3-562e27e61540","Type":"ContainerStarted","Data":"a6717e744cbb0bd2925e73c39d7c0e6d8170260901d6f5fd7d8a153a28e6e4a1"}
Mar 10 15:19:29 crc kubenswrapper[4795]: I0310 15:19:29.630873 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" event={"ID":"8876699d-9c7b-48bc-86e3-562e27e61540","Type":"ContainerStarted","Data":"be56cf5f293c203c32c9adef014a419448872a92d1d1047ff79ca2e9f4900602"}
Mar 10 15:19:29 crc kubenswrapper[4795]: I0310 15:19:29.631477 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b"
Mar 10 15:19:29 crc kubenswrapper[4795]: I0310 15:19:29.631492 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b"
Mar 10 15:19:29 crc kubenswrapper[4795]: I0310 15:19:29.659697 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b"
Mar 10 15:19:29 crc kubenswrapper[4795]: I0310 15:19:29.661001 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b" podStartSLOduration=7.660978577 podStartE2EDuration="7.660978577s" podCreationTimestamp="2026-03-10 15:19:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:19:29.656274623 +0000 UTC m=+802.822015551" watchObservedRunningTime="2026-03-10 15:19:29.660978577 +0000 UTC m=+802.826719505"
Mar 10 15:19:30 crc kubenswrapper[4795]: I0310 15:19:30.637918 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b"
Mar 10 15:19:30 crc kubenswrapper[4795]: I0310 15:19:30.682111 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b"
Mar 10 15:19:35 crc kubenswrapper[4795]: I0310 15:19:35.476399 4795 scope.go:117] "RemoveContainer" containerID="a32e09e6dbae777fd5c7993935247e382578d35bdb2f96fa0cb4593dbf63686a"
Mar 10 15:19:35 crc kubenswrapper[4795]: E0310 15:19:35.477499 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-v49r8_openshift-multus(589b366f-9132-43cc-8d7a-d401d396bf06)\"" pod="openshift-multus/multus-v49r8" podUID="589b366f-9132-43cc-8d7a-d401d396bf06"
Mar 10 15:19:48 crc kubenswrapper[4795]: I0310 15:19:48.539465 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:19:48 crc kubenswrapper[4795]: I0310 15:19:48.540246 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:19:50 crc kubenswrapper[4795]: I0310 15:19:50.476866 4795 scope.go:117] "RemoveContainer" containerID="a32e09e6dbae777fd5c7993935247e382578d35bdb2f96fa0cb4593dbf63686a"
Mar 10 15:19:50 crc kubenswrapper[4795]: I0310 15:19:50.780676 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v49r8_589b366f-9132-43cc-8d7a-d401d396bf06/kube-multus/2.log"
Mar 10 15:19:50 crc kubenswrapper[4795]: I0310 15:19:50.781010 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v49r8" event={"ID":"589b366f-9132-43cc-8d7a-d401d396bf06","Type":"ContainerStarted","Data":"a2f417e8b9e483cc7cc94598ac8807666fb17d997fa6714620940b65ce5841a0"}
Mar 10 15:19:53 crc kubenswrapper[4795]: I0310 15:19:53.145982 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qxn8b"
Mar 10 15:19:59 crc kubenswrapper[4795]: I0310 15:19:59.841575 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"]
Mar 10 15:19:59 crc kubenswrapper[4795]: I0310 15:19:59.843657 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"
Mar 10 15:19:59 crc kubenswrapper[4795]: I0310 15:19:59.846334 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 10 15:19:59 crc kubenswrapper[4795]: I0310 15:19:59.847056 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"]
Mar 10 15:19:59 crc kubenswrapper[4795]: I0310 15:19:59.911153 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"
Mar 10 15:19:59 crc kubenswrapper[4795]: I0310 15:19:59.911217 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"
Mar 10 15:19:59 crc kubenswrapper[4795]: I0310 15:19:59.911262 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp648\" (UniqueName: \"kubernetes.io/projected/debdf9b8-7f1a-4d6a-be68-78d832d39089-kube-api-access-zp648\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.012926 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.013024 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp648\" (UniqueName: \"kubernetes.io/projected/debdf9b8-7f1a-4d6a-be68-78d832d39089-kube-api-access-zp648\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.013138 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.013545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.013820 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.042287 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp648\" (UniqueName: \"kubernetes.io/projected/debdf9b8-7f1a-4d6a-be68-78d832d39089-kube-api-access-zp648\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.157793 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552600-mcxq8"]
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.159392 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-mcxq8"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.162278 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.164999 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552600-mcxq8"]
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.165950 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.166118 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.166373 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.215680 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8kvq\" (UniqueName: \"kubernetes.io/projected/73528fe5-4e4b-4ba5-b30e-e781e1f8d12e-kube-api-access-b8kvq\") pod \"auto-csr-approver-29552600-mcxq8\" (UID: \"73528fe5-4e4b-4ba5-b30e-e781e1f8d12e\") " pod="openshift-infra/auto-csr-approver-29552600-mcxq8"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.317199 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8kvq\" (UniqueName: \"kubernetes.io/projected/73528fe5-4e4b-4ba5-b30e-e781e1f8d12e-kube-api-access-b8kvq\") pod \"auto-csr-approver-29552600-mcxq8\" (UID: \"73528fe5-4e4b-4ba5-b30e-e781e1f8d12e\") " pod="openshift-infra/auto-csr-approver-29552600-mcxq8"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.366247 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8kvq\" (UniqueName: \"kubernetes.io/projected/73528fe5-4e4b-4ba5-b30e-e781e1f8d12e-kube-api-access-b8kvq\") pod \"auto-csr-approver-29552600-mcxq8\" (UID: \"73528fe5-4e4b-4ba5-b30e-e781e1f8d12e\") " pod="openshift-infra/auto-csr-approver-29552600-mcxq8"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.434127 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf"]
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.497574 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-mcxq8"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.847531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf" event={"ID":"debdf9b8-7f1a-4d6a-be68-78d832d39089","Type":"ContainerStarted","Data":"2368f9d715842b0fb294196c752c206c11de21e4fac21137781cdf77870f7201"}
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.847813 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf" event={"ID":"debdf9b8-7f1a-4d6a-be68-78d832d39089","Type":"ContainerStarted","Data":"6daf4880c1f16b7c5ac23af3e2e379c0209b0ee7180e2d150c68f968d2dae5c8"}
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.882761 4795 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 10 15:20:00 crc kubenswrapper[4795]: I0310 15:20:00.966304 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552600-mcxq8"]
Mar 10 15:20:01 crc kubenswrapper[4795]: I0310 15:20:01.861193 4795 generic.go:334] "Generic (PLEG): container finished" podID="debdf9b8-7f1a-4d6a-be68-78d832d39089" containerID="2368f9d715842b0fb294196c752c206c11de21e4fac21137781cdf77870f7201" exitCode=0
Mar 10 15:20:01 crc kubenswrapper[4795]: I0310 15:20:01.861308 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf" event={"ID":"debdf9b8-7f1a-4d6a-be68-78d832d39089","Type":"ContainerDied","Data":"2368f9d715842b0fb294196c752c206c11de21e4fac21137781cdf77870f7201"}
Mar 10 15:20:01 crc kubenswrapper[4795]: I0310 15:20:01.864041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552600-mcxq8" event={"ID":"73528fe5-4e4b-4ba5-b30e-e781e1f8d12e","Type":"ContainerStarted","Data":"a98a52f7f65ae65cc2d1605936ba8c6f08f817cc1b7c9a3f7febaa26f465d130"}
Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.210461 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zk92c"]
Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.222624 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zk92c"
Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.237696 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zk92c"]
Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.243972 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdkh7\" (UniqueName: \"kubernetes.io/projected/851bdec3-8512-4770-bb00-fc6925170305-kube-api-access-bdkh7\") pod \"redhat-operators-zk92c\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " pod="openshift-marketplace/redhat-operators-zk92c"
Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.244053 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-utilities\") pod \"redhat-operators-zk92c\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " pod="openshift-marketplace/redhat-operators-zk92c"
Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.244119 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-catalog-content\") pod \"redhat-operators-zk92c\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " pod="openshift-marketplace/redhat-operators-zk92c"
Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.344875 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-utilities\") pod \"redhat-operators-zk92c\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " pod="openshift-marketplace/redhat-operators-zk92c"
Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.345012 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-catalog-content\") pod \"redhat-operators-zk92c\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " pod="openshift-marketplace/redhat-operators-zk92c"
Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.345058 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdkh7\" (UniqueName: \"kubernetes.io/projected/851bdec3-8512-4770-bb00-fc6925170305-kube-api-access-bdkh7\") pod \"redhat-operators-zk92c\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " pod="openshift-marketplace/redhat-operators-zk92c"
Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.345512 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-utilities\") pod \"redhat-operators-zk92c\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " pod="openshift-marketplace/redhat-operators-zk92c"
Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.345599 4795 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-catalog-content\") pod \"redhat-operators-zk92c\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " pod="openshift-marketplace/redhat-operators-zk92c" Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.363843 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdkh7\" (UniqueName: \"kubernetes.io/projected/851bdec3-8512-4770-bb00-fc6925170305-kube-api-access-bdkh7\") pod \"redhat-operators-zk92c\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " pod="openshift-marketplace/redhat-operators-zk92c" Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.551746 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zk92c" Mar 10 15:20:02 crc kubenswrapper[4795]: I0310 15:20:02.963219 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zk92c"] Mar 10 15:20:02 crc kubenswrapper[4795]: W0310 15:20:02.975319 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod851bdec3_8512_4770_bb00_fc6925170305.slice/crio-7a2d06799b542824037cb1fe5214a2bd2782e0bb3423533b6422cb2ee0a64865 WatchSource:0}: Error finding container 7a2d06799b542824037cb1fe5214a2bd2782e0bb3423533b6422cb2ee0a64865: Status 404 returned error can't find the container with id 7a2d06799b542824037cb1fe5214a2bd2782e0bb3423533b6422cb2ee0a64865 Mar 10 15:20:03 crc kubenswrapper[4795]: I0310 15:20:03.874928 4795 generic.go:334] "Generic (PLEG): container finished" podID="73528fe5-4e4b-4ba5-b30e-e781e1f8d12e" containerID="3d31e4d5fbe6c4068da3104fbf96a6a0c42498e9bc8b835bbc0f00b31d232baf" exitCode=0 Mar 10 15:20:03 crc kubenswrapper[4795]: I0310 15:20:03.875020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29552600-mcxq8" event={"ID":"73528fe5-4e4b-4ba5-b30e-e781e1f8d12e","Type":"ContainerDied","Data":"3d31e4d5fbe6c4068da3104fbf96a6a0c42498e9bc8b835bbc0f00b31d232baf"} Mar 10 15:20:03 crc kubenswrapper[4795]: I0310 15:20:03.876675 4795 generic.go:334] "Generic (PLEG): container finished" podID="851bdec3-8512-4770-bb00-fc6925170305" containerID="b6cc2e89d15cf7db02d3d5e337646ee54ba08ec626358eb0c367b3131a860bc9" exitCode=0 Mar 10 15:20:03 crc kubenswrapper[4795]: I0310 15:20:03.876708 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zk92c" event={"ID":"851bdec3-8512-4770-bb00-fc6925170305","Type":"ContainerDied","Data":"b6cc2e89d15cf7db02d3d5e337646ee54ba08ec626358eb0c367b3131a860bc9"} Mar 10 15:20:03 crc kubenswrapper[4795]: I0310 15:20:03.876730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zk92c" event={"ID":"851bdec3-8512-4770-bb00-fc6925170305","Type":"ContainerStarted","Data":"7a2d06799b542824037cb1fe5214a2bd2782e0bb3423533b6422cb2ee0a64865"} Mar 10 15:20:04 crc kubenswrapper[4795]: I0310 15:20:04.888448 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf" event={"ID":"debdf9b8-7f1a-4d6a-be68-78d832d39089","Type":"ContainerDied","Data":"1c2f5051c3d67f3a4c19ca8221b415f837488a9901339d4ed50155657f209e06"} Mar 10 15:20:04 crc kubenswrapper[4795]: I0310 15:20:04.888290 4795 generic.go:334] "Generic (PLEG): container finished" podID="debdf9b8-7f1a-4d6a-be68-78d832d39089" containerID="1c2f5051c3d67f3a4c19ca8221b415f837488a9901339d4ed50155657f209e06" exitCode=0 Mar 10 15:20:05 crc kubenswrapper[4795]: I0310 15:20:05.157717 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-mcxq8" Mar 10 15:20:05 crc kubenswrapper[4795]: I0310 15:20:05.186491 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8kvq\" (UniqueName: \"kubernetes.io/projected/73528fe5-4e4b-4ba5-b30e-e781e1f8d12e-kube-api-access-b8kvq\") pod \"73528fe5-4e4b-4ba5-b30e-e781e1f8d12e\" (UID: \"73528fe5-4e4b-4ba5-b30e-e781e1f8d12e\") " Mar 10 15:20:05 crc kubenswrapper[4795]: I0310 15:20:05.194320 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73528fe5-4e4b-4ba5-b30e-e781e1f8d12e-kube-api-access-b8kvq" (OuterVolumeSpecName: "kube-api-access-b8kvq") pod "73528fe5-4e4b-4ba5-b30e-e781e1f8d12e" (UID: "73528fe5-4e4b-4ba5-b30e-e781e1f8d12e"). InnerVolumeSpecName "kube-api-access-b8kvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:20:05 crc kubenswrapper[4795]: I0310 15:20:05.287741 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8kvq\" (UniqueName: \"kubernetes.io/projected/73528fe5-4e4b-4ba5-b30e-e781e1f8d12e-kube-api-access-b8kvq\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:05 crc kubenswrapper[4795]: I0310 15:20:05.898038 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552600-mcxq8" event={"ID":"73528fe5-4e4b-4ba5-b30e-e781e1f8d12e","Type":"ContainerDied","Data":"a98a52f7f65ae65cc2d1605936ba8c6f08f817cc1b7c9a3f7febaa26f465d130"} Mar 10 15:20:05 crc kubenswrapper[4795]: I0310 15:20:05.898436 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a98a52f7f65ae65cc2d1605936ba8c6f08f817cc1b7c9a3f7febaa26f465d130" Mar 10 15:20:05 crc kubenswrapper[4795]: I0310 15:20:05.898127 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552600-mcxq8" Mar 10 15:20:05 crc kubenswrapper[4795]: I0310 15:20:05.901029 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zk92c" event={"ID":"851bdec3-8512-4770-bb00-fc6925170305","Type":"ContainerStarted","Data":"200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe"} Mar 10 15:20:05 crc kubenswrapper[4795]: I0310 15:20:05.906373 4795 generic.go:334] "Generic (PLEG): container finished" podID="debdf9b8-7f1a-4d6a-be68-78d832d39089" containerID="7dc2538c387b39505673d008839d0425c4c6d99887d6418fdf84749835eb5584" exitCode=0 Mar 10 15:20:05 crc kubenswrapper[4795]: I0310 15:20:05.906429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf" event={"ID":"debdf9b8-7f1a-4d6a-be68-78d832d39089","Type":"ContainerDied","Data":"7dc2538c387b39505673d008839d0425c4c6d99887d6418fdf84749835eb5584"} Mar 10 15:20:06 crc kubenswrapper[4795]: I0310 15:20:06.224851 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-c8s55"] Mar 10 15:20:06 crc kubenswrapper[4795]: I0310 15:20:06.231483 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552594-c8s55"] Mar 10 15:20:06 crc kubenswrapper[4795]: I0310 15:20:06.917262 4795 generic.go:334] "Generic (PLEG): container finished" podID="851bdec3-8512-4770-bb00-fc6925170305" containerID="200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe" exitCode=0 Mar 10 15:20:06 crc kubenswrapper[4795]: I0310 15:20:06.917379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zk92c" event={"ID":"851bdec3-8512-4770-bb00-fc6925170305","Type":"ContainerDied","Data":"200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe"} Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 
15:20:07.168125 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf" Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.313488 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-util\") pod \"debdf9b8-7f1a-4d6a-be68-78d832d39089\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.313563 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp648\" (UniqueName: \"kubernetes.io/projected/debdf9b8-7f1a-4d6a-be68-78d832d39089-kube-api-access-zp648\") pod \"debdf9b8-7f1a-4d6a-be68-78d832d39089\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.313596 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-bundle\") pod \"debdf9b8-7f1a-4d6a-be68-78d832d39089\" (UID: \"debdf9b8-7f1a-4d6a-be68-78d832d39089\") " Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.315149 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-bundle" (OuterVolumeSpecName: "bundle") pod "debdf9b8-7f1a-4d6a-be68-78d832d39089" (UID: "debdf9b8-7f1a-4d6a-be68-78d832d39089"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.319247 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/debdf9b8-7f1a-4d6a-be68-78d832d39089-kube-api-access-zp648" (OuterVolumeSpecName: "kube-api-access-zp648") pod "debdf9b8-7f1a-4d6a-be68-78d832d39089" (UID: "debdf9b8-7f1a-4d6a-be68-78d832d39089"). InnerVolumeSpecName "kube-api-access-zp648". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.334370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-util" (OuterVolumeSpecName: "util") pod "debdf9b8-7f1a-4d6a-be68-78d832d39089" (UID: "debdf9b8-7f1a-4d6a-be68-78d832d39089"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.414617 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-util\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.414646 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp648\" (UniqueName: \"kubernetes.io/projected/debdf9b8-7f1a-4d6a-be68-78d832d39089-kube-api-access-zp648\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.414657 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/debdf9b8-7f1a-4d6a-be68-78d832d39089-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.485233 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01f61473-d3ef-4f22-8f44-abb0b66b8a77" path="/var/lib/kubelet/pods/01f61473-d3ef-4f22-8f44-abb0b66b8a77/volumes" Mar 10 15:20:07 crc 
kubenswrapper[4795]: I0310 15:20:07.927527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zk92c" event={"ID":"851bdec3-8512-4770-bb00-fc6925170305","Type":"ContainerStarted","Data":"f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8"} Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.933061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf" event={"ID":"debdf9b8-7f1a-4d6a-be68-78d832d39089","Type":"ContainerDied","Data":"6daf4880c1f16b7c5ac23af3e2e379c0209b0ee7180e2d150c68f968d2dae5c8"} Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.933125 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6daf4880c1f16b7c5ac23af3e2e379c0209b0ee7180e2d150c68f968d2dae5c8" Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.933151 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf" Mar 10 15:20:07 crc kubenswrapper[4795]: I0310 15:20:07.964602 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zk92c" podStartSLOduration=2.468568078 podStartE2EDuration="5.964572045s" podCreationTimestamp="2026-03-10 15:20:02 +0000 UTC" firstStartedPulling="2026-03-10 15:20:03.878310691 +0000 UTC m=+837.044051589" lastFinishedPulling="2026-03-10 15:20:07.374314648 +0000 UTC m=+840.540055556" observedRunningTime="2026-03-10 15:20:07.955061644 +0000 UTC m=+841.120802572" watchObservedRunningTime="2026-03-10 15:20:07.964572045 +0000 UTC m=+841.130312983" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.357894 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-h2mf9"] Mar 10 15:20:10 crc kubenswrapper[4795]: E0310 15:20:10.358316 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debdf9b8-7f1a-4d6a-be68-78d832d39089" containerName="pull" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.358327 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="debdf9b8-7f1a-4d6a-be68-78d832d39089" containerName="pull" Mar 10 15:20:10 crc kubenswrapper[4795]: E0310 15:20:10.358335 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debdf9b8-7f1a-4d6a-be68-78d832d39089" containerName="extract" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.358340 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="debdf9b8-7f1a-4d6a-be68-78d832d39089" containerName="extract" Mar 10 15:20:10 crc kubenswrapper[4795]: E0310 15:20:10.358354 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debdf9b8-7f1a-4d6a-be68-78d832d39089" containerName="util" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.358362 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="debdf9b8-7f1a-4d6a-be68-78d832d39089" containerName="util" Mar 10 15:20:10 crc kubenswrapper[4795]: E0310 15:20:10.358371 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73528fe5-4e4b-4ba5-b30e-e781e1f8d12e" containerName="oc" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.358377 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="73528fe5-4e4b-4ba5-b30e-e781e1f8d12e" containerName="oc" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.358471 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="73528fe5-4e4b-4ba5-b30e-e781e1f8d12e" containerName="oc" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.358488 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="debdf9b8-7f1a-4d6a-be68-78d832d39089" containerName="extract" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.358845 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-h2mf9" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.360745 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.360918 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-s9qfq" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.361319 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.373981 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-h2mf9"] Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.382523 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2bmr\" (UniqueName: \"kubernetes.io/projected/f1ae1d31-73b1-4619-91f2-39b6f1a8ad62-kube-api-access-k2bmr\") pod \"nmstate-operator-75c5dccd6c-h2mf9\" (UID: \"f1ae1d31-73b1-4619-91f2-39b6f1a8ad62\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-h2mf9" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.483852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2bmr\" (UniqueName: \"kubernetes.io/projected/f1ae1d31-73b1-4619-91f2-39b6f1a8ad62-kube-api-access-k2bmr\") pod \"nmstate-operator-75c5dccd6c-h2mf9\" (UID: \"f1ae1d31-73b1-4619-91f2-39b6f1a8ad62\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-h2mf9" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.499372 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2bmr\" (UniqueName: \"kubernetes.io/projected/f1ae1d31-73b1-4619-91f2-39b6f1a8ad62-kube-api-access-k2bmr\") pod \"nmstate-operator-75c5dccd6c-h2mf9\" (UID: 
\"f1ae1d31-73b1-4619-91f2-39b6f1a8ad62\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-h2mf9" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.673488 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-h2mf9" Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.878048 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-h2mf9"] Mar 10 15:20:10 crc kubenswrapper[4795]: W0310 15:20:10.884465 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1ae1d31_73b1_4619_91f2_39b6f1a8ad62.slice/crio-017ed174cf9ce01c8b80dfdd1e255cdea37fd3e4c7ec46f2697ce12cee0e1ad0 WatchSource:0}: Error finding container 017ed174cf9ce01c8b80dfdd1e255cdea37fd3e4c7ec46f2697ce12cee0e1ad0: Status 404 returned error can't find the container with id 017ed174cf9ce01c8b80dfdd1e255cdea37fd3e4c7ec46f2697ce12cee0e1ad0 Mar 10 15:20:10 crc kubenswrapper[4795]: I0310 15:20:10.956031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-h2mf9" event={"ID":"f1ae1d31-73b1-4619-91f2-39b6f1a8ad62","Type":"ContainerStarted","Data":"017ed174cf9ce01c8b80dfdd1e255cdea37fd3e4c7ec46f2697ce12cee0e1ad0"} Mar 10 15:20:12 crc kubenswrapper[4795]: I0310 15:20:12.552898 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zk92c" Mar 10 15:20:12 crc kubenswrapper[4795]: I0310 15:20:12.553226 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zk92c" Mar 10 15:20:13 crc kubenswrapper[4795]: I0310 15:20:13.602131 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zk92c" podUID="851bdec3-8512-4770-bb00-fc6925170305" containerName="registry-server" probeResult="failure" output=< Mar 
10 15:20:13 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 10 15:20:13 crc kubenswrapper[4795]: > Mar 10 15:20:14 crc kubenswrapper[4795]: I0310 15:20:14.983911 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-h2mf9" event={"ID":"f1ae1d31-73b1-4619-91f2-39b6f1a8ad62","Type":"ContainerStarted","Data":"6e3d9573d7f8bc5792e64ac2c4568bc7f4f92e172b70ee5895eb7ae9cc6ceb0d"} Mar 10 15:20:15 crc kubenswrapper[4795]: I0310 15:20:15.008371 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-h2mf9" podStartSLOduration=1.959079491 podStartE2EDuration="5.008349953s" podCreationTimestamp="2026-03-10 15:20:10 +0000 UTC" firstStartedPulling="2026-03-10 15:20:10.886426533 +0000 UTC m=+844.052167431" lastFinishedPulling="2026-03-10 15:20:13.935696995 +0000 UTC m=+847.101437893" observedRunningTime="2026-03-10 15:20:15.004895504 +0000 UTC m=+848.170636412" watchObservedRunningTime="2026-03-10 15:20:15.008349953 +0000 UTC m=+848.174090851" Mar 10 15:20:18 crc kubenswrapper[4795]: I0310 15:20:18.538860 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:20:18 crc kubenswrapper[4795]: I0310 15:20:18.539270 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:20:18 crc kubenswrapper[4795]: I0310 15:20:18.539338 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:20:18 crc kubenswrapper[4795]: I0310 15:20:18.540232 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"763f5a84ea79ca41e852b7c467dce5e81209d9a31a115f8701cd8dd045d46d0f"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:20:18 crc kubenswrapper[4795]: I0310 15:20:18.540344 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://763f5a84ea79ca41e852b7c467dce5e81209d9a31a115f8701cd8dd045d46d0f" gracePeriod=600 Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.009428 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="763f5a84ea79ca41e852b7c467dce5e81209d9a31a115f8701cd8dd045d46d0f" exitCode=0 Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.009482 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"763f5a84ea79ca41e852b7c467dce5e81209d9a31a115f8701cd8dd045d46d0f"} Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.009830 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"552e3b68186e04166961655f3dfc811f8b5622423236f01ad885373bab0e984d"} Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.009853 4795 scope.go:117] "RemoveContainer" 
containerID="8436cacf1b5dbb1ecf156aa5041298791f50d7318b7f087048ec77945fbd7697" Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.958566 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-cj65x"] Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.959976 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-cj65x" Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.963502 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-h59p8" Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.972255 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-k85t4"] Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.973169 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.974957 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.980280 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-cj65x"] Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.989133 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-wqm26"] Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.989975 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:19 crc kubenswrapper[4795]: I0310 15:20:19.996402 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-k85t4"] Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.014646 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7ctf\" (UniqueName: \"kubernetes.io/projected/f7da0eaa-841e-4750-9018-2264cc0142ff-kube-api-access-b7ctf\") pod \"nmstate-webhook-786f45cff4-k85t4\" (UID: \"f7da0eaa-841e-4750-9018-2264cc0142ff\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.014803 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d4285f38-a8d4-4521-9f80-27346b23640e-nmstate-lock\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.015029 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d4285f38-a8d4-4521-9f80-27346b23640e-dbus-socket\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.015237 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjv5d\" (UniqueName: \"kubernetes.io/projected/d4285f38-a8d4-4521-9f80-27346b23640e-kube-api-access-hjv5d\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.015345 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d4285f38-a8d4-4521-9f80-27346b23640e-ovs-socket\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.015405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f7da0eaa-841e-4750-9018-2264cc0142ff-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-k85t4\" (UID: \"f7da0eaa-841e-4750-9018-2264cc0142ff\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.015433 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4p4t\" (UniqueName: \"kubernetes.io/projected/30c516e5-88f8-4da7-a095-d56867635a94-kube-api-access-l4p4t\") pod \"nmstate-metrics-69594cc75-cj65x\" (UID: \"30c516e5-88f8-4da7-a095-d56867635a94\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-cj65x" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.098124 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5"] Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.098975 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.100391 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.100473 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-tbhbz" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.105908 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5"] Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.110576 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.115985 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv54l\" (UniqueName: \"kubernetes.io/projected/73454f9e-adb1-4874-a046-8a74850d4667-kube-api-access-qv54l\") pod \"nmstate-console-plugin-5dcbbd79cf-5w9l5\" (UID: \"73454f9e-adb1-4874-a046-8a74850d4667\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116050 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7ctf\" (UniqueName: \"kubernetes.io/projected/f7da0eaa-841e-4750-9018-2264cc0142ff-kube-api-access-b7ctf\") pod \"nmstate-webhook-786f45cff4-k85t4\" (UID: \"f7da0eaa-841e-4750-9018-2264cc0142ff\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d4285f38-a8d4-4521-9f80-27346b23640e-nmstate-lock\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " 
pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116193 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/73454f9e-adb1-4874-a046-8a74850d4667-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5w9l5\" (UID: \"73454f9e-adb1-4874-a046-8a74850d4667\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116377 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d4285f38-a8d4-4521-9f80-27346b23640e-nmstate-lock\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116394 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d4285f38-a8d4-4521-9f80-27346b23640e-dbus-socket\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116495 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjv5d\" (UniqueName: \"kubernetes.io/projected/d4285f38-a8d4-4521-9f80-27346b23640e-kube-api-access-hjv5d\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d4285f38-a8d4-4521-9f80-27346b23640e-dbus-socket\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 
15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116657 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d4285f38-a8d4-4521-9f80-27346b23640e-ovs-socket\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116684 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d4285f38-a8d4-4521-9f80-27346b23640e-ovs-socket\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116706 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f7da0eaa-841e-4750-9018-2264cc0142ff-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-k85t4\" (UID: \"f7da0eaa-841e-4750-9018-2264cc0142ff\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116743 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4p4t\" (UniqueName: \"kubernetes.io/projected/30c516e5-88f8-4da7-a095-d56867635a94-kube-api-access-l4p4t\") pod \"nmstate-metrics-69594cc75-cj65x\" (UID: \"30c516e5-88f8-4da7-a095-d56867635a94\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-cj65x" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.116769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/73454f9e-adb1-4874-a046-8a74850d4667-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5w9l5\" (UID: \"73454f9e-adb1-4874-a046-8a74850d4667\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 
10 15:20:20 crc kubenswrapper[4795]: E0310 15:20:20.116857 4795 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 10 15:20:20 crc kubenswrapper[4795]: E0310 15:20:20.116906 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7da0eaa-841e-4750-9018-2264cc0142ff-tls-key-pair podName:f7da0eaa-841e-4750-9018-2264cc0142ff nodeName:}" failed. No retries permitted until 2026-03-10 15:20:20.616888179 +0000 UTC m=+853.782629077 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/f7da0eaa-841e-4750-9018-2264cc0142ff-tls-key-pair") pod "nmstate-webhook-786f45cff4-k85t4" (UID: "f7da0eaa-841e-4750-9018-2264cc0142ff") : secret "openshift-nmstate-webhook" not found Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.139135 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7ctf\" (UniqueName: \"kubernetes.io/projected/f7da0eaa-841e-4750-9018-2264cc0142ff-kube-api-access-b7ctf\") pod \"nmstate-webhook-786f45cff4-k85t4\" (UID: \"f7da0eaa-841e-4750-9018-2264cc0142ff\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.139692 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjv5d\" (UniqueName: \"kubernetes.io/projected/d4285f38-a8d4-4521-9f80-27346b23640e-kube-api-access-hjv5d\") pod \"nmstate-handler-wqm26\" (UID: \"d4285f38-a8d4-4521-9f80-27346b23640e\") " pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.159910 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4p4t\" (UniqueName: \"kubernetes.io/projected/30c516e5-88f8-4da7-a095-d56867635a94-kube-api-access-l4p4t\") pod \"nmstate-metrics-69594cc75-cj65x\" (UID: \"30c516e5-88f8-4da7-a095-d56867635a94\") " 
pod="openshift-nmstate/nmstate-metrics-69594cc75-cj65x" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.217784 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/73454f9e-adb1-4874-a046-8a74850d4667-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5w9l5\" (UID: \"73454f9e-adb1-4874-a046-8a74850d4667\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.218509 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv54l\" (UniqueName: \"kubernetes.io/projected/73454f9e-adb1-4874-a046-8a74850d4667-kube-api-access-qv54l\") pod \"nmstate-console-plugin-5dcbbd79cf-5w9l5\" (UID: \"73454f9e-adb1-4874-a046-8a74850d4667\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.218614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/73454f9e-adb1-4874-a046-8a74850d4667-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5w9l5\" (UID: \"73454f9e-adb1-4874-a046-8a74850d4667\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.219872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/73454f9e-adb1-4874-a046-8a74850d4667-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-5w9l5\" (UID: \"73454f9e-adb1-4874-a046-8a74850d4667\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 10 15:20:20 crc kubenswrapper[4795]: E0310 15:20:20.218300 4795 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 10 15:20:20 crc kubenswrapper[4795]: E0310 15:20:20.220115 4795 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73454f9e-adb1-4874-a046-8a74850d4667-plugin-serving-cert podName:73454f9e-adb1-4874-a046-8a74850d4667 nodeName:}" failed. No retries permitted until 2026-03-10 15:20:20.720097463 +0000 UTC m=+853.885838361 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/73454f9e-adb1-4874-a046-8a74850d4667-plugin-serving-cert") pod "nmstate-console-plugin-5dcbbd79cf-5w9l5" (UID: "73454f9e-adb1-4874-a046-8a74850d4667") : secret "plugin-serving-cert" not found Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.243230 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv54l\" (UniqueName: \"kubernetes.io/projected/73454f9e-adb1-4874-a046-8a74850d4667-kube-api-access-qv54l\") pod \"nmstate-console-plugin-5dcbbd79cf-5w9l5\" (UID: \"73454f9e-adb1-4874-a046-8a74850d4667\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.280272 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5bc65dbc8f-4r2kv"] Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.281042 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.281339 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-cj65x" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.296800 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bc65dbc8f-4r2kv"] Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.320394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-console-config\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.320437 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkm58\" (UniqueName: \"kubernetes.io/projected/35c8bd66-baf8-4c49-829e-63a59f0eec0f-kube-api-access-fkm58\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.320469 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-trusted-ca-bundle\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.320525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35c8bd66-baf8-4c49-829e-63a59f0eec0f-console-serving-cert\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.320563 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-service-ca\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.320581 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-oauth-serving-cert\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.320615 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35c8bd66-baf8-4c49-829e-63a59f0eec0f-console-oauth-config\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.342136 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:20 crc kubenswrapper[4795]: W0310 15:20:20.359669 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4285f38_a8d4_4521_9f80_27346b23640e.slice/crio-4bf73410823fc091c4d8be82e4e154ba3798b653f2d15ead922c87fbccec9bae WatchSource:0}: Error finding container 4bf73410823fc091c4d8be82e4e154ba3798b653f2d15ead922c87fbccec9bae: Status 404 returned error can't find the container with id 4bf73410823fc091c4d8be82e4e154ba3798b653f2d15ead922c87fbccec9bae Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.426926 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-oauth-serving-cert\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.427393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35c8bd66-baf8-4c49-829e-63a59f0eec0f-console-oauth-config\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.427430 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-console-config\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.427455 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkm58\" (UniqueName: 
\"kubernetes.io/projected/35c8bd66-baf8-4c49-829e-63a59f0eec0f-kube-api-access-fkm58\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.427510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-trusted-ca-bundle\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.427619 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35c8bd66-baf8-4c49-829e-63a59f0eec0f-console-serving-cert\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.427693 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-service-ca\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.431587 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-oauth-serving-cert\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.431894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-console-config\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.433443 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-service-ca\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.433881 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/35c8bd66-baf8-4c49-829e-63a59f0eec0f-console-oauth-config\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.434908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/35c8bd66-baf8-4c49-829e-63a59f0eec0f-console-serving-cert\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.434908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/35c8bd66-baf8-4c49-829e-63a59f0eec0f-trusted-ca-bundle\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.442810 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkm58\" (UniqueName: 
\"kubernetes.io/projected/35c8bd66-baf8-4c49-829e-63a59f0eec0f-kube-api-access-fkm58\") pod \"console-5bc65dbc8f-4r2kv\" (UID: \"35c8bd66-baf8-4c49-829e-63a59f0eec0f\") " pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.596682 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.630044 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f7da0eaa-841e-4750-9018-2264cc0142ff-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-k85t4\" (UID: \"f7da0eaa-841e-4750-9018-2264cc0142ff\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.635662 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/f7da0eaa-841e-4750-9018-2264cc0142ff-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-k85t4\" (UID: \"f7da0eaa-841e-4750-9018-2264cc0142ff\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.686098 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-cj65x"] Mar 10 15:20:20 crc kubenswrapper[4795]: W0310 15:20:20.691051 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30c516e5_88f8_4da7_a095_d56867635a94.slice/crio-dd3a9901103c489af95fadadc26363bf2c0abe20a7157405362c664ca0baf470 WatchSource:0}: Error finding container dd3a9901103c489af95fadadc26363bf2c0abe20a7157405362c664ca0baf470: Status 404 returned error can't find the container with id dd3a9901103c489af95fadadc26363bf2c0abe20a7157405362c664ca0baf470 Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.731703 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/73454f9e-adb1-4874-a046-8a74850d4667-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5w9l5\" (UID: \"73454f9e-adb1-4874-a046-8a74850d4667\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.736701 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/73454f9e-adb1-4874-a046-8a74850d4667-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-5w9l5\" (UID: \"73454f9e-adb1-4874-a046-8a74850d4667\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.827644 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5bc65dbc8f-4r2kv"] Mar 10 15:20:20 crc kubenswrapper[4795]: I0310 15:20:20.927434 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" Mar 10 15:20:21 crc kubenswrapper[4795]: I0310 15:20:21.023858 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" Mar 10 15:20:21 crc kubenswrapper[4795]: I0310 15:20:21.027374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bc65dbc8f-4r2kv" event={"ID":"35c8bd66-baf8-4c49-829e-63a59f0eec0f","Type":"ContainerStarted","Data":"2ff0d33702e56e79e02f9c4d10211d57f1c571736eeb473ac15a551e0ddf5605"} Mar 10 15:20:21 crc kubenswrapper[4795]: I0310 15:20:21.027407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5bc65dbc8f-4r2kv" event={"ID":"35c8bd66-baf8-4c49-829e-63a59f0eec0f","Type":"ContainerStarted","Data":"fe4a972c1af9b7567ce89d4f698a5fc3caffde471433ea595770c83cbf1956d8"} Mar 10 15:20:21 crc kubenswrapper[4795]: I0310 15:20:21.031328 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wqm26" event={"ID":"d4285f38-a8d4-4521-9f80-27346b23640e","Type":"ContainerStarted","Data":"4bf73410823fc091c4d8be82e4e154ba3798b653f2d15ead922c87fbccec9bae"} Mar 10 15:20:21 crc kubenswrapper[4795]: I0310 15:20:21.034115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-cj65x" event={"ID":"30c516e5-88f8-4da7-a095-d56867635a94","Type":"ContainerStarted","Data":"dd3a9901103c489af95fadadc26363bf2c0abe20a7157405362c664ca0baf470"} Mar 10 15:20:21 crc kubenswrapper[4795]: I0310 15:20:21.046693 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5bc65dbc8f-4r2kv" podStartSLOduration=1.046675011 podStartE2EDuration="1.046675011s" podCreationTimestamp="2026-03-10 15:20:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:20:21.045256041 +0000 UTC m=+854.210996939" watchObservedRunningTime="2026-03-10 15:20:21.046675011 +0000 UTC m=+854.212415919" Mar 10 15:20:21 crc kubenswrapper[4795]: I0310 
15:20:21.123609 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-k85t4"] Mar 10 15:20:21 crc kubenswrapper[4795]: I0310 15:20:21.239002 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5"] Mar 10 15:20:21 crc kubenswrapper[4795]: W0310 15:20:21.246341 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73454f9e_adb1_4874_a046_8a74850d4667.slice/crio-c044a38d2570e10c4dc2dad47b9783b29bebf2514fa9e466727fba85e20929e2 WatchSource:0}: Error finding container c044a38d2570e10c4dc2dad47b9783b29bebf2514fa9e466727fba85e20929e2: Status 404 returned error can't find the container with id c044a38d2570e10c4dc2dad47b9783b29bebf2514fa9e466727fba85e20929e2 Mar 10 15:20:22 crc kubenswrapper[4795]: I0310 15:20:22.051003 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" event={"ID":"f7da0eaa-841e-4750-9018-2264cc0142ff","Type":"ContainerStarted","Data":"91ed49be428e83755cf888134ab79be0b2fe195dda5dcaa5987c9f06f0196a15"} Mar 10 15:20:22 crc kubenswrapper[4795]: I0310 15:20:22.054207 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" event={"ID":"73454f9e-adb1-4874-a046-8a74850d4667","Type":"ContainerStarted","Data":"c044a38d2570e10c4dc2dad47b9783b29bebf2514fa9e466727fba85e20929e2"} Mar 10 15:20:22 crc kubenswrapper[4795]: I0310 15:20:22.596339 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zk92c" Mar 10 15:20:22 crc kubenswrapper[4795]: I0310 15:20:22.668338 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zk92c" Mar 10 15:20:22 crc kubenswrapper[4795]: I0310 15:20:22.833048 4795 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-operators-zk92c"] Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.075027 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" event={"ID":"f7da0eaa-841e-4750-9018-2264cc0142ff","Type":"ContainerStarted","Data":"6f32266d40f212266659ceea0deb4adfc30fff350609c38073378f282f269137"} Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.075360 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.077949 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-wqm26" event={"ID":"d4285f38-a8d4-4521-9f80-27346b23640e","Type":"ContainerStarted","Data":"064ffe1bb634d45c81748c3077d22dc6e8c342b19efb5d4b2213517e15307af6"} Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.078032 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.079472 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zk92c" podUID="851bdec3-8512-4770-bb00-fc6925170305" containerName="registry-server" containerID="cri-o://f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8" gracePeriod=2 Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.079724 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-cj65x" event={"ID":"30c516e5-88f8-4da7-a095-d56867635a94","Type":"ContainerStarted","Data":"fc05f733c1ba6e37b759177dd7ec4d907bd3c2cc5ffb9990d2c92cb022d4bd96"} Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.093847 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" podStartSLOduration=2.756883359 
podStartE2EDuration="5.093826424s" podCreationTimestamp="2026-03-10 15:20:19 +0000 UTC" firstStartedPulling="2026-03-10 15:20:21.115028651 +0000 UTC m=+854.280769549" lastFinishedPulling="2026-03-10 15:20:23.451971706 +0000 UTC m=+856.617712614" observedRunningTime="2026-03-10 15:20:24.089713107 +0000 UTC m=+857.255453995" watchObservedRunningTime="2026-03-10 15:20:24.093826424 +0000 UTC m=+857.259567322" Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.105672 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-wqm26" podStartSLOduration=1.9575380070000001 podStartE2EDuration="5.105656792s" podCreationTimestamp="2026-03-10 15:20:19 +0000 UTC" firstStartedPulling="2026-03-10 15:20:20.365454168 +0000 UTC m=+853.531195066" lastFinishedPulling="2026-03-10 15:20:23.513572943 +0000 UTC m=+856.679313851" observedRunningTime="2026-03-10 15:20:24.104232021 +0000 UTC m=+857.269972929" watchObservedRunningTime="2026-03-10 15:20:24.105656792 +0000 UTC m=+857.271397710" Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.790886 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zk92c" Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.891200 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdkh7\" (UniqueName: \"kubernetes.io/projected/851bdec3-8512-4770-bb00-fc6925170305-kube-api-access-bdkh7\") pod \"851bdec3-8512-4770-bb00-fc6925170305\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.891291 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-catalog-content\") pod \"851bdec3-8512-4770-bb00-fc6925170305\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.891342 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-utilities\") pod \"851bdec3-8512-4770-bb00-fc6925170305\" (UID: \"851bdec3-8512-4770-bb00-fc6925170305\") " Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.892229 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-utilities" (OuterVolumeSpecName: "utilities") pod "851bdec3-8512-4770-bb00-fc6925170305" (UID: "851bdec3-8512-4770-bb00-fc6925170305"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.894796 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/851bdec3-8512-4770-bb00-fc6925170305-kube-api-access-bdkh7" (OuterVolumeSpecName: "kube-api-access-bdkh7") pod "851bdec3-8512-4770-bb00-fc6925170305" (UID: "851bdec3-8512-4770-bb00-fc6925170305"). InnerVolumeSpecName "kube-api-access-bdkh7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.993935 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:24 crc kubenswrapper[4795]: I0310 15:20:24.993990 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdkh7\" (UniqueName: \"kubernetes.io/projected/851bdec3-8512-4770-bb00-fc6925170305-kube-api-access-bdkh7\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.068174 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "851bdec3-8512-4770-bb00-fc6925170305" (UID: "851bdec3-8512-4770-bb00-fc6925170305"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.089269 4795 generic.go:334] "Generic (PLEG): container finished" podID="851bdec3-8512-4770-bb00-fc6925170305" containerID="f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8" exitCode=0 Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.089342 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zk92c" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.089372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zk92c" event={"ID":"851bdec3-8512-4770-bb00-fc6925170305","Type":"ContainerDied","Data":"f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8"} Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.089404 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zk92c" event={"ID":"851bdec3-8512-4770-bb00-fc6925170305","Type":"ContainerDied","Data":"7a2d06799b542824037cb1fe5214a2bd2782e0bb3423533b6422cb2ee0a64865"} Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.089426 4795 scope.go:117] "RemoveContainer" containerID="f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.094864 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/851bdec3-8512-4770-bb00-fc6925170305-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.094916 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" event={"ID":"73454f9e-adb1-4874-a046-8a74850d4667","Type":"ContainerStarted","Data":"bb8d85d98273a9f073f05b4ea2f34cce23864d123b35871297b91220ef02e120"} Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.121138 4795 scope.go:117] "RemoveContainer" containerID="200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.129187 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-5w9l5" podStartSLOduration=1.759797144 podStartE2EDuration="5.129162848s" podCreationTimestamp="2026-03-10 15:20:20 +0000 UTC" 
firstStartedPulling="2026-03-10 15:20:21.248239701 +0000 UTC m=+854.413980609" lastFinishedPulling="2026-03-10 15:20:24.617605415 +0000 UTC m=+857.783346313" observedRunningTime="2026-03-10 15:20:25.117281889 +0000 UTC m=+858.283022817" watchObservedRunningTime="2026-03-10 15:20:25.129162848 +0000 UTC m=+858.294903756" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.139755 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zk92c"] Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.145672 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zk92c"] Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.151656 4795 scope.go:117] "RemoveContainer" containerID="b6cc2e89d15cf7db02d3d5e337646ee54ba08ec626358eb0c367b3131a860bc9" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.168099 4795 scope.go:117] "RemoveContainer" containerID="f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8" Mar 10 15:20:25 crc kubenswrapper[4795]: E0310 15:20:25.169280 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8\": container with ID starting with f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8 not found: ID does not exist" containerID="f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.169340 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8"} err="failed to get container status \"f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8\": rpc error: code = NotFound desc = could not find container \"f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8\": container with ID starting with 
f9d98bb6b0b029034371afe0377f9a07012befa202e1d90331049eb1149998d8 not found: ID does not exist" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.169371 4795 scope.go:117] "RemoveContainer" containerID="200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe" Mar 10 15:20:25 crc kubenswrapper[4795]: E0310 15:20:25.169704 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe\": container with ID starting with 200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe not found: ID does not exist" containerID="200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.169730 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe"} err="failed to get container status \"200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe\": rpc error: code = NotFound desc = could not find container \"200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe\": container with ID starting with 200720bf459f26c6ca2e890c3995a44eab9a5cff72e9f3397b3d9d2b3c5f57fe not found: ID does not exist" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.169748 4795 scope.go:117] "RemoveContainer" containerID="b6cc2e89d15cf7db02d3d5e337646ee54ba08ec626358eb0c367b3131a860bc9" Mar 10 15:20:25 crc kubenswrapper[4795]: E0310 15:20:25.170112 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6cc2e89d15cf7db02d3d5e337646ee54ba08ec626358eb0c367b3131a860bc9\": container with ID starting with b6cc2e89d15cf7db02d3d5e337646ee54ba08ec626358eb0c367b3131a860bc9 not found: ID does not exist" containerID="b6cc2e89d15cf7db02d3d5e337646ee54ba08ec626358eb0c367b3131a860bc9" Mar 10 15:20:25 crc 
kubenswrapper[4795]: I0310 15:20:25.170137 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6cc2e89d15cf7db02d3d5e337646ee54ba08ec626358eb0c367b3131a860bc9"} err="failed to get container status \"b6cc2e89d15cf7db02d3d5e337646ee54ba08ec626358eb0c367b3131a860bc9\": rpc error: code = NotFound desc = could not find container \"b6cc2e89d15cf7db02d3d5e337646ee54ba08ec626358eb0c367b3131a860bc9\": container with ID starting with b6cc2e89d15cf7db02d3d5e337646ee54ba08ec626358eb0c367b3131a860bc9 not found: ID does not exist" Mar 10 15:20:25 crc kubenswrapper[4795]: I0310 15:20:25.483898 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="851bdec3-8512-4770-bb00-fc6925170305" path="/var/lib/kubelet/pods/851bdec3-8512-4770-bb00-fc6925170305/volumes" Mar 10 15:20:29 crc kubenswrapper[4795]: I0310 15:20:29.129210 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-cj65x" event={"ID":"30c516e5-88f8-4da7-a095-d56867635a94","Type":"ContainerStarted","Data":"81d8f5afd449cad30c05612bfd644c6c27efdab7a30909206df82a114fff694b"} Mar 10 15:20:29 crc kubenswrapper[4795]: I0310 15:20:29.160641 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-cj65x" podStartSLOduration=2.745557457 podStartE2EDuration="10.160611058s" podCreationTimestamp="2026-03-10 15:20:19 +0000 UTC" firstStartedPulling="2026-03-10 15:20:20.693775764 +0000 UTC m=+853.859516662" lastFinishedPulling="2026-03-10 15:20:28.108829365 +0000 UTC m=+861.274570263" observedRunningTime="2026-03-10 15:20:29.155561194 +0000 UTC m=+862.321302172" watchObservedRunningTime="2026-03-10 15:20:29.160611058 +0000 UTC m=+862.326351996" Mar 10 15:20:30 crc kubenswrapper[4795]: I0310 15:20:30.478599 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-wqm26" Mar 10 15:20:30 crc 
kubenswrapper[4795]: I0310 15:20:30.597127 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:30 crc kubenswrapper[4795]: I0310 15:20:30.597163 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:30 crc kubenswrapper[4795]: I0310 15:20:30.601670 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:31 crc kubenswrapper[4795]: I0310 15:20:31.421957 4795 scope.go:117] "RemoveContainer" containerID="33ebbf35641f5d193e21a31217a65a594ef5a2f13ee57f6b29005868e3d0047e" Mar 10 15:20:31 crc kubenswrapper[4795]: I0310 15:20:31.470720 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5bc65dbc8f-4r2kv" Mar 10 15:20:31 crc kubenswrapper[4795]: I0310 15:20:31.534585 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-g452c"] Mar 10 15:20:40 crc kubenswrapper[4795]: I0310 15:20:40.936414 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-k85t4" Mar 10 15:20:43 crc kubenswrapper[4795]: I0310 15:20:43.881817 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xwfc8"] Mar 10 15:20:43 crc kubenswrapper[4795]: E0310 15:20:43.882445 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="851bdec3-8512-4770-bb00-fc6925170305" containerName="extract-content" Mar 10 15:20:43 crc kubenswrapper[4795]: I0310 15:20:43.882461 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="851bdec3-8512-4770-bb00-fc6925170305" containerName="extract-content" Mar 10 15:20:43 crc kubenswrapper[4795]: E0310 15:20:43.882477 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="851bdec3-8512-4770-bb00-fc6925170305" 
containerName="registry-server" Mar 10 15:20:43 crc kubenswrapper[4795]: I0310 15:20:43.882484 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="851bdec3-8512-4770-bb00-fc6925170305" containerName="registry-server" Mar 10 15:20:43 crc kubenswrapper[4795]: E0310 15:20:43.882498 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="851bdec3-8512-4770-bb00-fc6925170305" containerName="extract-utilities" Mar 10 15:20:43 crc kubenswrapper[4795]: I0310 15:20:43.882507 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="851bdec3-8512-4770-bb00-fc6925170305" containerName="extract-utilities" Mar 10 15:20:43 crc kubenswrapper[4795]: I0310 15:20:43.882648 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="851bdec3-8512-4770-bb00-fc6925170305" containerName="registry-server" Mar 10 15:20:43 crc kubenswrapper[4795]: I0310 15:20:43.883720 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:43 crc kubenswrapper[4795]: I0310 15:20:43.898360 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwfc8"] Mar 10 15:20:43 crc kubenswrapper[4795]: I0310 15:20:43.994768 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-catalog-content\") pod \"redhat-marketplace-xwfc8\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:43 crc kubenswrapper[4795]: I0310 15:20:43.994839 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-utilities\") pod \"redhat-marketplace-xwfc8\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " 
pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:43 crc kubenswrapper[4795]: I0310 15:20:43.994936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nnnl\" (UniqueName: \"kubernetes.io/projected/3befb36d-7acd-42db-96e0-433f5cf50f32-kube-api-access-4nnnl\") pod \"redhat-marketplace-xwfc8\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:44 crc kubenswrapper[4795]: I0310 15:20:44.096509 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-catalog-content\") pod \"redhat-marketplace-xwfc8\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:44 crc kubenswrapper[4795]: I0310 15:20:44.096580 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-utilities\") pod \"redhat-marketplace-xwfc8\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:44 crc kubenswrapper[4795]: I0310 15:20:44.096647 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nnnl\" (UniqueName: \"kubernetes.io/projected/3befb36d-7acd-42db-96e0-433f5cf50f32-kube-api-access-4nnnl\") pod \"redhat-marketplace-xwfc8\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:44 crc kubenswrapper[4795]: I0310 15:20:44.097022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-catalog-content\") pod \"redhat-marketplace-xwfc8\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " 
pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:44 crc kubenswrapper[4795]: I0310 15:20:44.097538 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-utilities\") pod \"redhat-marketplace-xwfc8\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:44 crc kubenswrapper[4795]: I0310 15:20:44.125754 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nnnl\" (UniqueName: \"kubernetes.io/projected/3befb36d-7acd-42db-96e0-433f5cf50f32-kube-api-access-4nnnl\") pod \"redhat-marketplace-xwfc8\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:44 crc kubenswrapper[4795]: I0310 15:20:44.216610 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:44 crc kubenswrapper[4795]: I0310 15:20:44.430827 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwfc8"] Mar 10 15:20:44 crc kubenswrapper[4795]: W0310 15:20:44.444802 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3befb36d_7acd_42db_96e0_433f5cf50f32.slice/crio-f2c654b5eb69615c4cf74a552b6a79402794a59eb087a257c9770067297132ac WatchSource:0}: Error finding container f2c654b5eb69615c4cf74a552b6a79402794a59eb087a257c9770067297132ac: Status 404 returned error can't find the container with id f2c654b5eb69615c4cf74a552b6a79402794a59eb087a257c9770067297132ac Mar 10 15:20:44 crc kubenswrapper[4795]: I0310 15:20:44.557092 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwfc8" 
event={"ID":"3befb36d-7acd-42db-96e0-433f5cf50f32","Type":"ContainerStarted","Data":"f2c654b5eb69615c4cf74a552b6a79402794a59eb087a257c9770067297132ac"} Mar 10 15:20:45 crc kubenswrapper[4795]: I0310 15:20:45.563808 4795 generic.go:334] "Generic (PLEG): container finished" podID="3befb36d-7acd-42db-96e0-433f5cf50f32" containerID="176d87c3f5449ad1b376b5a7232a1ba1005317fff1bbe4df3e5ef851de9b4578" exitCode=0 Mar 10 15:20:45 crc kubenswrapper[4795]: I0310 15:20:45.564103 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwfc8" event={"ID":"3befb36d-7acd-42db-96e0-433f5cf50f32","Type":"ContainerDied","Data":"176d87c3f5449ad1b376b5a7232a1ba1005317fff1bbe4df3e5ef851de9b4578"} Mar 10 15:20:46 crc kubenswrapper[4795]: I0310 15:20:46.572683 4795 generic.go:334] "Generic (PLEG): container finished" podID="3befb36d-7acd-42db-96e0-433f5cf50f32" containerID="bdc8c693625c83905cab9e16f88956a53d848c8197e231f8b7631c4e4e7a88e0" exitCode=0 Mar 10 15:20:46 crc kubenswrapper[4795]: I0310 15:20:46.572880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwfc8" event={"ID":"3befb36d-7acd-42db-96e0-433f5cf50f32","Type":"ContainerDied","Data":"bdc8c693625c83905cab9e16f88956a53d848c8197e231f8b7631c4e4e7a88e0"} Mar 10 15:20:47 crc kubenswrapper[4795]: I0310 15:20:47.592668 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwfc8" event={"ID":"3befb36d-7acd-42db-96e0-433f5cf50f32","Type":"ContainerStarted","Data":"546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76"} Mar 10 15:20:47 crc kubenswrapper[4795]: I0310 15:20:47.629524 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xwfc8" podStartSLOduration=3.035070126 podStartE2EDuration="4.629502838s" podCreationTimestamp="2026-03-10 15:20:43 +0000 UTC" firstStartedPulling="2026-03-10 15:20:45.56512363 +0000 UTC 
m=+878.730864528" lastFinishedPulling="2026-03-10 15:20:47.159556332 +0000 UTC m=+880.325297240" observedRunningTime="2026-03-10 15:20:47.622861498 +0000 UTC m=+880.788602446" watchObservedRunningTime="2026-03-10 15:20:47.629502838 +0000 UTC m=+880.795243746" Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.267984 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l4xk5"] Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.272499 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.283009 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4xk5"] Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.393742 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-catalog-content\") pod \"community-operators-l4xk5\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.393820 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-utilities\") pod \"community-operators-l4xk5\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.393843 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8jzc\" (UniqueName: \"kubernetes.io/projected/254c6166-15b9-4fd0-a181-bda892531c5b-kube-api-access-z8jzc\") pod \"community-operators-l4xk5\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " 
pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.495317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-catalog-content\") pod \"community-operators-l4xk5\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.495379 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-utilities\") pod \"community-operators-l4xk5\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.495398 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8jzc\" (UniqueName: \"kubernetes.io/projected/254c6166-15b9-4fd0-a181-bda892531c5b-kube-api-access-z8jzc\") pod \"community-operators-l4xk5\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.495854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-utilities\") pod \"community-operators-l4xk5\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.495994 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-catalog-content\") pod \"community-operators-l4xk5\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " 
pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.527186 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8jzc\" (UniqueName: \"kubernetes.io/projected/254c6166-15b9-4fd0-a181-bda892531c5b-kube-api-access-z8jzc\") pod \"community-operators-l4xk5\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:20:50 crc kubenswrapper[4795]: I0310 15:20:50.627567 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:20:51 crc kubenswrapper[4795]: I0310 15:20:51.116235 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l4xk5"] Mar 10 15:20:51 crc kubenswrapper[4795]: I0310 15:20:51.617355 4795 generic.go:334] "Generic (PLEG): container finished" podID="254c6166-15b9-4fd0-a181-bda892531c5b" containerID="2f61828d7e9b1d03e9f15e2ef28845f74683db264b5442348bdd5c315f514216" exitCode=0 Mar 10 15:20:51 crc kubenswrapper[4795]: I0310 15:20:51.617476 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xk5" event={"ID":"254c6166-15b9-4fd0-a181-bda892531c5b","Type":"ContainerDied","Data":"2f61828d7e9b1d03e9f15e2ef28845f74683db264b5442348bdd5c315f514216"} Mar 10 15:20:51 crc kubenswrapper[4795]: I0310 15:20:51.618017 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xk5" event={"ID":"254c6166-15b9-4fd0-a181-bda892531c5b","Type":"ContainerStarted","Data":"c814f903e5d4946e69789c993e01e4dc9c23ec4c60a534b58d3a029069a03f61"} Mar 10 15:20:52 crc kubenswrapper[4795]: I0310 15:20:52.625623 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xk5" 
event={"ID":"254c6166-15b9-4fd0-a181-bda892531c5b","Type":"ContainerStarted","Data":"91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78"} Mar 10 15:20:53 crc kubenswrapper[4795]: I0310 15:20:53.631652 4795 generic.go:334] "Generic (PLEG): container finished" podID="254c6166-15b9-4fd0-a181-bda892531c5b" containerID="91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78" exitCode=0 Mar 10 15:20:53 crc kubenswrapper[4795]: I0310 15:20:53.631742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xk5" event={"ID":"254c6166-15b9-4fd0-a181-bda892531c5b","Type":"ContainerDied","Data":"91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78"} Mar 10 15:20:54 crc kubenswrapper[4795]: I0310 15:20:54.217802 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:54 crc kubenswrapper[4795]: I0310 15:20:54.218271 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:54 crc kubenswrapper[4795]: I0310 15:20:54.259197 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:54 crc kubenswrapper[4795]: I0310 15:20:54.642259 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xk5" event={"ID":"254c6166-15b9-4fd0-a181-bda892531c5b","Type":"ContainerStarted","Data":"676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18"} Mar 10 15:20:54 crc kubenswrapper[4795]: I0310 15:20:54.666020 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l4xk5" podStartSLOduration=2.226875602 podStartE2EDuration="4.666001689s" podCreationTimestamp="2026-03-10 15:20:50 +0000 UTC" firstStartedPulling="2026-03-10 15:20:51.6200155 +0000 UTC 
m=+884.785756408" lastFinishedPulling="2026-03-10 15:20:54.059141577 +0000 UTC m=+887.224882495" observedRunningTime="2026-03-10 15:20:54.665440953 +0000 UTC m=+887.831181851" watchObservedRunningTime="2026-03-10 15:20:54.666001689 +0000 UTC m=+887.831742587" Mar 10 15:20:54 crc kubenswrapper[4795]: I0310 15:20:54.692272 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.687580 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm"] Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.689310 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.691985 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.704236 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm"] Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.867683 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jprkc\" (UniqueName: \"kubernetes.io/projected/62a78361-debb-4191-af7d-24be60f6fe39-kube-api-access-jprkc\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.867750 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.867771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.969338 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.969471 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.970326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jprkc\" (UniqueName: \"kubernetes.io/projected/62a78361-debb-4191-af7d-24be60f6fe39-kube-api-access-jprkc\") pod 
\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.970415 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.970524 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:20:55 crc kubenswrapper[4795]: I0310 15:20:55.993780 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jprkc\" (UniqueName: \"kubernetes.io/projected/62a78361-debb-4191-af7d-24be60f6fe39-kube-api-access-jprkc\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:20:56 crc kubenswrapper[4795]: I0310 15:20:56.015311 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:20:56 crc kubenswrapper[4795]: I0310 15:20:56.487459 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm"] Mar 10 15:20:56 crc kubenswrapper[4795]: I0310 15:20:56.595562 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-g452c" podUID="a8e49156-de09-480a-933b-6815cde0b311" containerName="console" containerID="cri-o://388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74" gracePeriod=15 Mar 10 15:20:56 crc kubenswrapper[4795]: I0310 15:20:56.656764 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" event={"ID":"62a78361-debb-4191-af7d-24be60f6fe39","Type":"ContainerStarted","Data":"d58589c85f02b1133863403c58af78d5c23fe1fae5cb4060cae701ec3a430b50"} Mar 10 15:20:56 crc kubenswrapper[4795]: I0310 15:20:56.656809 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" event={"ID":"62a78361-debb-4191-af7d-24be60f6fe39","Type":"ContainerStarted","Data":"927a5157cb8d68da649bda04fca67725eb2a2eab029709cb875d075ea9d8bdf3"} Mar 10 15:20:56 crc kubenswrapper[4795]: I0310 15:20:56.907576 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-g452c_a8e49156-de09-480a-933b-6815cde0b311/console/0.log" Mar 10 15:20:56 crc kubenswrapper[4795]: I0310 15:20:56.907874 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.093186 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-oauth-config\") pod \"a8e49156-de09-480a-933b-6815cde0b311\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.093377 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-oauth-serving-cert\") pod \"a8e49156-de09-480a-933b-6815cde0b311\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.093428 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-trusted-ca-bundle\") pod \"a8e49156-de09-480a-933b-6815cde0b311\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.093927 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-console-config\") pod \"a8e49156-de09-480a-933b-6815cde0b311\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.094024 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-serving-cert\") pod \"a8e49156-de09-480a-933b-6815cde0b311\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.094099 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-service-ca\") pod \"a8e49156-de09-480a-933b-6815cde0b311\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.094136 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4m98\" (UniqueName: \"kubernetes.io/projected/a8e49156-de09-480a-933b-6815cde0b311-kube-api-access-b4m98\") pod \"a8e49156-de09-480a-933b-6815cde0b311\" (UID: \"a8e49156-de09-480a-933b-6815cde0b311\") " Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.094673 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-console-config" (OuterVolumeSpecName: "console-config") pod "a8e49156-de09-480a-933b-6815cde0b311" (UID: "a8e49156-de09-480a-933b-6815cde0b311"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.094771 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-service-ca" (OuterVolumeSpecName: "service-ca") pod "a8e49156-de09-480a-933b-6815cde0b311" (UID: "a8e49156-de09-480a-933b-6815cde0b311"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.094860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a8e49156-de09-480a-933b-6815cde0b311" (UID: "a8e49156-de09-480a-933b-6815cde0b311"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.095609 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a8e49156-de09-480a-933b-6815cde0b311" (UID: "a8e49156-de09-480a-933b-6815cde0b311"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.095988 4795 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-console-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.096038 4795 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-service-ca\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.096057 4795 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.096100 4795 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e49156-de09-480a-933b-6815cde0b311-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.102453 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8e49156-de09-480a-933b-6815cde0b311-kube-api-access-b4m98" (OuterVolumeSpecName: "kube-api-access-b4m98") pod "a8e49156-de09-480a-933b-6815cde0b311" (UID: "a8e49156-de09-480a-933b-6815cde0b311"). InnerVolumeSpecName "kube-api-access-b4m98". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.102794 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a8e49156-de09-480a-933b-6815cde0b311" (UID: "a8e49156-de09-480a-933b-6815cde0b311"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.105214 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a8e49156-de09-480a-933b-6815cde0b311" (UID: "a8e49156-de09-480a-933b-6815cde0b311"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.197825 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4m98\" (UniqueName: \"kubernetes.io/projected/a8e49156-de09-480a-933b-6815cde0b311-kube-api-access-b4m98\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.197897 4795 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.197925 4795 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e49156-de09-480a-933b-6815cde0b311-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.642900 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwfc8"] Mar 10 15:20:57 crc 
kubenswrapper[4795]: I0310 15:20:57.643280 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xwfc8" podUID="3befb36d-7acd-42db-96e0-433f5cf50f32" containerName="registry-server" containerID="cri-o://546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76" gracePeriod=2 Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.664254 4795 generic.go:334] "Generic (PLEG): container finished" podID="62a78361-debb-4191-af7d-24be60f6fe39" containerID="d58589c85f02b1133863403c58af78d5c23fe1fae5cb4060cae701ec3a430b50" exitCode=0 Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.664318 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" event={"ID":"62a78361-debb-4191-af7d-24be60f6fe39","Type":"ContainerDied","Data":"d58589c85f02b1133863403c58af78d5c23fe1fae5cb4060cae701ec3a430b50"} Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.667584 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-g452c_a8e49156-de09-480a-933b-6815cde0b311/console/0.log" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.667620 4795 generic.go:334] "Generic (PLEG): container finished" podID="a8e49156-de09-480a-933b-6815cde0b311" containerID="388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74" exitCode=2 Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.667643 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g452c" event={"ID":"a8e49156-de09-480a-933b-6815cde0b311","Type":"ContainerDied","Data":"388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74"} Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.667663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-g452c" 
event={"ID":"a8e49156-de09-480a-933b-6815cde0b311","Type":"ContainerDied","Data":"610b29f6ba39af590af62b9626e010b61dd5c359b3890c33218ea0eb7f5527bb"} Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.667680 4795 scope.go:117] "RemoveContainer" containerID="388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.667702 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-g452c" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.693541 4795 scope.go:117] "RemoveContainer" containerID="388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74" Mar 10 15:20:57 crc kubenswrapper[4795]: E0310 15:20:57.694124 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74\": container with ID starting with 388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74 not found: ID does not exist" containerID="388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.694181 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74"} err="failed to get container status \"388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74\": rpc error: code = NotFound desc = could not find container \"388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74\": container with ID starting with 388ac8e39f5ee06979ad59320621912dad60b90b521121882a55c0d8681bad74 not found: ID does not exist" Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.707041 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-g452c"] Mar 10 15:20:57 crc kubenswrapper[4795]: I0310 15:20:57.711009 4795 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-g452c"] Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.196314 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.312044 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4nnnl\" (UniqueName: \"kubernetes.io/projected/3befb36d-7acd-42db-96e0-433f5cf50f32-kube-api-access-4nnnl\") pod \"3befb36d-7acd-42db-96e0-433f5cf50f32\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.312185 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-catalog-content\") pod \"3befb36d-7acd-42db-96e0-433f5cf50f32\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.312385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-utilities\") pod \"3befb36d-7acd-42db-96e0-433f5cf50f32\" (UID: \"3befb36d-7acd-42db-96e0-433f5cf50f32\") " Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.313819 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-utilities" (OuterVolumeSpecName: "utilities") pod "3befb36d-7acd-42db-96e0-433f5cf50f32" (UID: "3befb36d-7acd-42db-96e0-433f5cf50f32"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.318513 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3befb36d-7acd-42db-96e0-433f5cf50f32-kube-api-access-4nnnl" (OuterVolumeSpecName: "kube-api-access-4nnnl") pod "3befb36d-7acd-42db-96e0-433f5cf50f32" (UID: "3befb36d-7acd-42db-96e0-433f5cf50f32"). InnerVolumeSpecName "kube-api-access-4nnnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.343562 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3befb36d-7acd-42db-96e0-433f5cf50f32" (UID: "3befb36d-7acd-42db-96e0-433f5cf50f32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.414386 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.414439 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4nnnl\" (UniqueName: \"kubernetes.io/projected/3befb36d-7acd-42db-96e0-433f5cf50f32-kube-api-access-4nnnl\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.414466 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3befb36d-7acd-42db-96e0-433f5cf50f32-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.679880 4795 generic.go:334] "Generic (PLEG): container finished" podID="3befb36d-7acd-42db-96e0-433f5cf50f32" 
containerID="546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76" exitCode=0 Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.679983 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xwfc8" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.680030 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwfc8" event={"ID":"3befb36d-7acd-42db-96e0-433f5cf50f32","Type":"ContainerDied","Data":"546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76"} Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.680152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xwfc8" event={"ID":"3befb36d-7acd-42db-96e0-433f5cf50f32","Type":"ContainerDied","Data":"f2c654b5eb69615c4cf74a552b6a79402794a59eb087a257c9770067297132ac"} Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.680196 4795 scope.go:117] "RemoveContainer" containerID="546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.708030 4795 scope.go:117] "RemoveContainer" containerID="bdc8c693625c83905cab9e16f88956a53d848c8197e231f8b7631c4e4e7a88e0" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.733229 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwfc8"] Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.741795 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xwfc8"] Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.747584 4795 scope.go:117] "RemoveContainer" containerID="176d87c3f5449ad1b376b5a7232a1ba1005317fff1bbe4df3e5ef851de9b4578" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.772266 4795 scope.go:117] "RemoveContainer" containerID="546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76" Mar 10 
15:20:58 crc kubenswrapper[4795]: E0310 15:20:58.773365 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76\": container with ID starting with 546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76 not found: ID does not exist" containerID="546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.773403 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76"} err="failed to get container status \"546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76\": rpc error: code = NotFound desc = could not find container \"546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76\": container with ID starting with 546504950d10c475f5d95efee8e222fa63bbb1693639e977bf5b8065bde4ef76 not found: ID does not exist" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.773433 4795 scope.go:117] "RemoveContainer" containerID="bdc8c693625c83905cab9e16f88956a53d848c8197e231f8b7631c4e4e7a88e0" Mar 10 15:20:58 crc kubenswrapper[4795]: E0310 15:20:58.773754 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdc8c693625c83905cab9e16f88956a53d848c8197e231f8b7631c4e4e7a88e0\": container with ID starting with bdc8c693625c83905cab9e16f88956a53d848c8197e231f8b7631c4e4e7a88e0 not found: ID does not exist" containerID="bdc8c693625c83905cab9e16f88956a53d848c8197e231f8b7631c4e4e7a88e0" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.773786 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdc8c693625c83905cab9e16f88956a53d848c8197e231f8b7631c4e4e7a88e0"} err="failed to get container status 
\"bdc8c693625c83905cab9e16f88956a53d848c8197e231f8b7631c4e4e7a88e0\": rpc error: code = NotFound desc = could not find container \"bdc8c693625c83905cab9e16f88956a53d848c8197e231f8b7631c4e4e7a88e0\": container with ID starting with bdc8c693625c83905cab9e16f88956a53d848c8197e231f8b7631c4e4e7a88e0 not found: ID does not exist" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.773805 4795 scope.go:117] "RemoveContainer" containerID="176d87c3f5449ad1b376b5a7232a1ba1005317fff1bbe4df3e5ef851de9b4578" Mar 10 15:20:58 crc kubenswrapper[4795]: E0310 15:20:58.774087 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"176d87c3f5449ad1b376b5a7232a1ba1005317fff1bbe4df3e5ef851de9b4578\": container with ID starting with 176d87c3f5449ad1b376b5a7232a1ba1005317fff1bbe4df3e5ef851de9b4578 not found: ID does not exist" containerID="176d87c3f5449ad1b376b5a7232a1ba1005317fff1bbe4df3e5ef851de9b4578" Mar 10 15:20:58 crc kubenswrapper[4795]: I0310 15:20:58.774122 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"176d87c3f5449ad1b376b5a7232a1ba1005317fff1bbe4df3e5ef851de9b4578"} err="failed to get container status \"176d87c3f5449ad1b376b5a7232a1ba1005317fff1bbe4df3e5ef851de9b4578\": rpc error: code = NotFound desc = could not find container \"176d87c3f5449ad1b376b5a7232a1ba1005317fff1bbe4df3e5ef851de9b4578\": container with ID starting with 176d87c3f5449ad1b376b5a7232a1ba1005317fff1bbe4df3e5ef851de9b4578 not found: ID does not exist" Mar 10 15:20:59 crc kubenswrapper[4795]: I0310 15:20:59.485221 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3befb36d-7acd-42db-96e0-433f5cf50f32" path="/var/lib/kubelet/pods/3befb36d-7acd-42db-96e0-433f5cf50f32/volumes" Mar 10 15:20:59 crc kubenswrapper[4795]: I0310 15:20:59.487100 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8e49156-de09-480a-933b-6815cde0b311" 
path="/var/lib/kubelet/pods/a8e49156-de09-480a-933b-6815cde0b311/volumes" Mar 10 15:20:59 crc kubenswrapper[4795]: E0310 15:20:59.487941 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62a78361_debb_4191_af7d_24be60f6fe39.slice/crio-c412a71340f0e1e9f2202af92dd1c5c26b89e6ec7f8647633bda8da34a8bfeec.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:20:59 crc kubenswrapper[4795]: I0310 15:20:59.697859 4795 generic.go:334] "Generic (PLEG): container finished" podID="62a78361-debb-4191-af7d-24be60f6fe39" containerID="c412a71340f0e1e9f2202af92dd1c5c26b89e6ec7f8647633bda8da34a8bfeec" exitCode=0 Mar 10 15:20:59 crc kubenswrapper[4795]: I0310 15:20:59.697922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" event={"ID":"62a78361-debb-4191-af7d-24be60f6fe39","Type":"ContainerDied","Data":"c412a71340f0e1e9f2202af92dd1c5c26b89e6ec7f8647633bda8da34a8bfeec"} Mar 10 15:21:00 crc kubenswrapper[4795]: I0310 15:21:00.629170 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:21:00 crc kubenswrapper[4795]: I0310 15:21:00.629704 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:21:00 crc kubenswrapper[4795]: I0310 15:21:00.671825 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:21:00 crc kubenswrapper[4795]: I0310 15:21:00.708595 4795 generic.go:334] "Generic (PLEG): container finished" podID="62a78361-debb-4191-af7d-24be60f6fe39" containerID="3e778be6129fedbb0bee58ccb3b5881f92b75a0a00a7c33a0e018b70fba9b11d" exitCode=0 Mar 10 15:21:00 crc kubenswrapper[4795]: I0310 15:21:00.708671 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" event={"ID":"62a78361-debb-4191-af7d-24be60f6fe39","Type":"ContainerDied","Data":"3e778be6129fedbb0bee58ccb3b5881f92b75a0a00a7c33a0e018b70fba9b11d"} Mar 10 15:21:00 crc kubenswrapper[4795]: I0310 15:21:00.753840 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.005948 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.171725 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-util\") pod \"62a78361-debb-4191-af7d-24be60f6fe39\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.171938 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-bundle\") pod \"62a78361-debb-4191-af7d-24be60f6fe39\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.172018 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jprkc\" (UniqueName: \"kubernetes.io/projected/62a78361-debb-4191-af7d-24be60f6fe39-kube-api-access-jprkc\") pod \"62a78361-debb-4191-af7d-24be60f6fe39\" (UID: \"62a78361-debb-4191-af7d-24be60f6fe39\") " Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.173700 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-bundle" (OuterVolumeSpecName: 
"bundle") pod "62a78361-debb-4191-af7d-24be60f6fe39" (UID: "62a78361-debb-4191-af7d-24be60f6fe39"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.180853 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62a78361-debb-4191-af7d-24be60f6fe39-kube-api-access-jprkc" (OuterVolumeSpecName: "kube-api-access-jprkc") pod "62a78361-debb-4191-af7d-24be60f6fe39" (UID: "62a78361-debb-4191-af7d-24be60f6fe39"). InnerVolumeSpecName "kube-api-access-jprkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.194159 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-util" (OuterVolumeSpecName: "util") pod "62a78361-debb-4191-af7d-24be60f6fe39" (UID: "62a78361-debb-4191-af7d-24be60f6fe39"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.273650 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jprkc\" (UniqueName: \"kubernetes.io/projected/62a78361-debb-4191-af7d-24be60f6fe39-kube-api-access-jprkc\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.273708 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-util\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.273729 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62a78361-debb-4191-af7d-24be60f6fe39-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.729280 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" event={"ID":"62a78361-debb-4191-af7d-24be60f6fe39","Type":"ContainerDied","Data":"927a5157cb8d68da649bda04fca67725eb2a2eab029709cb875d075ea9d8bdf3"} Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.729368 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="927a5157cb8d68da649bda04fca67725eb2a2eab029709cb875d075ea9d8bdf3" Mar 10 15:21:02 crc kubenswrapper[4795]: I0310 15:21:02.729372 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm" Mar 10 15:21:05 crc kubenswrapper[4795]: I0310 15:21:05.643284 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4xk5"] Mar 10 15:21:05 crc kubenswrapper[4795]: I0310 15:21:05.643970 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l4xk5" podUID="254c6166-15b9-4fd0-a181-bda892531c5b" containerName="registry-server" containerID="cri-o://676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18" gracePeriod=2 Mar 10 15:21:05 crc kubenswrapper[4795]: I0310 15:21:05.981373 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.026432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8jzc\" (UniqueName: \"kubernetes.io/projected/254c6166-15b9-4fd0-a181-bda892531c5b-kube-api-access-z8jzc\") pod \"254c6166-15b9-4fd0-a181-bda892531c5b\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.026520 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-catalog-content\") pod \"254c6166-15b9-4fd0-a181-bda892531c5b\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.026552 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-utilities\") pod \"254c6166-15b9-4fd0-a181-bda892531c5b\" (UID: \"254c6166-15b9-4fd0-a181-bda892531c5b\") " Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.027481 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-utilities" (OuterVolumeSpecName: "utilities") pod "254c6166-15b9-4fd0-a181-bda892531c5b" (UID: "254c6166-15b9-4fd0-a181-bda892531c5b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.031814 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/254c6166-15b9-4fd0-a181-bda892531c5b-kube-api-access-z8jzc" (OuterVolumeSpecName: "kube-api-access-z8jzc") pod "254c6166-15b9-4fd0-a181-bda892531c5b" (UID: "254c6166-15b9-4fd0-a181-bda892531c5b"). InnerVolumeSpecName "kube-api-access-z8jzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.088361 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "254c6166-15b9-4fd0-a181-bda892531c5b" (UID: "254c6166-15b9-4fd0-a181-bda892531c5b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.127377 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8jzc\" (UniqueName: \"kubernetes.io/projected/254c6166-15b9-4fd0-a181-bda892531c5b-kube-api-access-z8jzc\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.127407 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.127418 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/254c6166-15b9-4fd0-a181-bda892531c5b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.755973 4795 generic.go:334] "Generic (PLEG): container finished" podID="254c6166-15b9-4fd0-a181-bda892531c5b" containerID="676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18" exitCode=0 Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.756019 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l4xk5" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.756034 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xk5" event={"ID":"254c6166-15b9-4fd0-a181-bda892531c5b","Type":"ContainerDied","Data":"676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18"} Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.756108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l4xk5" event={"ID":"254c6166-15b9-4fd0-a181-bda892531c5b","Type":"ContainerDied","Data":"c814f903e5d4946e69789c993e01e4dc9c23ec4c60a534b58d3a029069a03f61"} Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.756143 4795 scope.go:117] "RemoveContainer" containerID="676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.780409 4795 scope.go:117] "RemoveContainer" containerID="91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.782615 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l4xk5"] Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.785796 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l4xk5"] Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.811693 4795 scope.go:117] "RemoveContainer" containerID="2f61828d7e9b1d03e9f15e2ef28845f74683db264b5442348bdd5c315f514216" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.824438 4795 scope.go:117] "RemoveContainer" containerID="676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18" Mar 10 15:21:06 crc kubenswrapper[4795]: E0310 15:21:06.824849 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18\": container with ID starting with 676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18 not found: ID does not exist" containerID="676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.824961 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18"} err="failed to get container status \"676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18\": rpc error: code = NotFound desc = could not find container \"676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18\": container with ID starting with 676ff83c509d0fa0ad5dffd753b922deddf091bdcca858183a0cea69c69e9c18 not found: ID does not exist" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.825042 4795 scope.go:117] "RemoveContainer" containerID="91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78" Mar 10 15:21:06 crc kubenswrapper[4795]: E0310 15:21:06.825446 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78\": container with ID starting with 91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78 not found: ID does not exist" containerID="91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.825484 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78"} err="failed to get container status \"91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78\": rpc error: code = NotFound desc = could not find container \"91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78\": container with ID 
starting with 91190f9c013274f7055a5bb72a869ade2beea8f78d69d764c9cf75e5dca08b78 not found: ID does not exist" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.825512 4795 scope.go:117] "RemoveContainer" containerID="2f61828d7e9b1d03e9f15e2ef28845f74683db264b5442348bdd5c315f514216" Mar 10 15:21:06 crc kubenswrapper[4795]: E0310 15:21:06.825779 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f61828d7e9b1d03e9f15e2ef28845f74683db264b5442348bdd5c315f514216\": container with ID starting with 2f61828d7e9b1d03e9f15e2ef28845f74683db264b5442348bdd5c315f514216 not found: ID does not exist" containerID="2f61828d7e9b1d03e9f15e2ef28845f74683db264b5442348bdd5c315f514216" Mar 10 15:21:06 crc kubenswrapper[4795]: I0310 15:21:06.825875 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f61828d7e9b1d03e9f15e2ef28845f74683db264b5442348bdd5c315f514216"} err="failed to get container status \"2f61828d7e9b1d03e9f15e2ef28845f74683db264b5442348bdd5c315f514216\": rpc error: code = NotFound desc = could not find container \"2f61828d7e9b1d03e9f15e2ef28845f74683db264b5442348bdd5c315f514216\": container with ID starting with 2f61828d7e9b1d03e9f15e2ef28845f74683db264b5442348bdd5c315f514216 not found: ID does not exist" Mar 10 15:21:07 crc kubenswrapper[4795]: I0310 15:21:07.495199 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="254c6166-15b9-4fd0-a181-bda892531c5b" path="/var/lib/kubelet/pods/254c6166-15b9-4fd0-a181-bda892531c5b/volumes" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.528896 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds"] Mar 10 15:21:11 crc kubenswrapper[4795]: E0310 15:21:11.529325 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3befb36d-7acd-42db-96e0-433f5cf50f32" containerName="extract-content" Mar 10 
15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529336 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3befb36d-7acd-42db-96e0-433f5cf50f32" containerName="extract-content" Mar 10 15:21:11 crc kubenswrapper[4795]: E0310 15:21:11.529345 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254c6166-15b9-4fd0-a181-bda892531c5b" containerName="extract-content" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529351 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="254c6166-15b9-4fd0-a181-bda892531c5b" containerName="extract-content" Mar 10 15:21:11 crc kubenswrapper[4795]: E0310 15:21:11.529359 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a78361-debb-4191-af7d-24be60f6fe39" containerName="extract" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529365 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a78361-debb-4191-af7d-24be60f6fe39" containerName="extract" Mar 10 15:21:11 crc kubenswrapper[4795]: E0310 15:21:11.529374 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a78361-debb-4191-af7d-24be60f6fe39" containerName="pull" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529380 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a78361-debb-4191-af7d-24be60f6fe39" containerName="pull" Mar 10 15:21:11 crc kubenswrapper[4795]: E0310 15:21:11.529392 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254c6166-15b9-4fd0-a181-bda892531c5b" containerName="extract-utilities" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529397 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="254c6166-15b9-4fd0-a181-bda892531c5b" containerName="extract-utilities" Mar 10 15:21:11 crc kubenswrapper[4795]: E0310 15:21:11.529405 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3befb36d-7acd-42db-96e0-433f5cf50f32" containerName="extract-utilities" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 
15:21:11.529410 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3befb36d-7acd-42db-96e0-433f5cf50f32" containerName="extract-utilities" Mar 10 15:21:11 crc kubenswrapper[4795]: E0310 15:21:11.529418 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="254c6166-15b9-4fd0-a181-bda892531c5b" containerName="registry-server" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529423 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="254c6166-15b9-4fd0-a181-bda892531c5b" containerName="registry-server" Mar 10 15:21:11 crc kubenswrapper[4795]: E0310 15:21:11.529430 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e49156-de09-480a-933b-6815cde0b311" containerName="console" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529436 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e49156-de09-480a-933b-6815cde0b311" containerName="console" Mar 10 15:21:11 crc kubenswrapper[4795]: E0310 15:21:11.529447 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62a78361-debb-4191-af7d-24be60f6fe39" containerName="util" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529453 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="62a78361-debb-4191-af7d-24be60f6fe39" containerName="util" Mar 10 15:21:11 crc kubenswrapper[4795]: E0310 15:21:11.529462 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3befb36d-7acd-42db-96e0-433f5cf50f32" containerName="registry-server" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529468 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3befb36d-7acd-42db-96e0-433f5cf50f32" containerName="registry-server" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529568 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3befb36d-7acd-42db-96e0-433f5cf50f32" containerName="registry-server" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529577 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a8e49156-de09-480a-933b-6815cde0b311" containerName="console" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529586 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="254c6166-15b9-4fd0-a181-bda892531c5b" containerName="registry-server" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529598 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="62a78361-debb-4191-af7d-24be60f6fe39" containerName="extract" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.529962 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.531577 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.533009 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.533134 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-b4wrq" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.533293 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.533337 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.582430 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds"] Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.693145 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/92eb5433-b34f-4b2b-bddf-ebe3de747f71-webhook-cert\") pod \"metallb-operator-controller-manager-5585467d4f-8qzds\" (UID: \"92eb5433-b34f-4b2b-bddf-ebe3de747f71\") " pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.693199 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxbrs\" (UniqueName: \"kubernetes.io/projected/92eb5433-b34f-4b2b-bddf-ebe3de747f71-kube-api-access-xxbrs\") pod \"metallb-operator-controller-manager-5585467d4f-8qzds\" (UID: \"92eb5433-b34f-4b2b-bddf-ebe3de747f71\") " pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.693272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92eb5433-b34f-4b2b-bddf-ebe3de747f71-apiservice-cert\") pod \"metallb-operator-controller-manager-5585467d4f-8qzds\" (UID: \"92eb5433-b34f-4b2b-bddf-ebe3de747f71\") " pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.793872 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92eb5433-b34f-4b2b-bddf-ebe3de747f71-apiservice-cert\") pod \"metallb-operator-controller-manager-5585467d4f-8qzds\" (UID: \"92eb5433-b34f-4b2b-bddf-ebe3de747f71\") " pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.794156 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/92eb5433-b34f-4b2b-bddf-ebe3de747f71-webhook-cert\") pod \"metallb-operator-controller-manager-5585467d4f-8qzds\" (UID: 
\"92eb5433-b34f-4b2b-bddf-ebe3de747f71\") " pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.794266 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxbrs\" (UniqueName: \"kubernetes.io/projected/92eb5433-b34f-4b2b-bddf-ebe3de747f71-kube-api-access-xxbrs\") pod \"metallb-operator-controller-manager-5585467d4f-8qzds\" (UID: \"92eb5433-b34f-4b2b-bddf-ebe3de747f71\") " pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.799678 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/92eb5433-b34f-4b2b-bddf-ebe3de747f71-webhook-cert\") pod \"metallb-operator-controller-manager-5585467d4f-8qzds\" (UID: \"92eb5433-b34f-4b2b-bddf-ebe3de747f71\") " pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.810646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxbrs\" (UniqueName: \"kubernetes.io/projected/92eb5433-b34f-4b2b-bddf-ebe3de747f71-kube-api-access-xxbrs\") pod \"metallb-operator-controller-manager-5585467d4f-8qzds\" (UID: \"92eb5433-b34f-4b2b-bddf-ebe3de747f71\") " pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.817199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/92eb5433-b34f-4b2b-bddf-ebe3de747f71-apiservice-cert\") pod \"metallb-operator-controller-manager-5585467d4f-8qzds\" (UID: \"92eb5433-b34f-4b2b-bddf-ebe3de747f71\") " pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.845647 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.878717 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h"] Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.879389 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.881296 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.881361 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.881559 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-9srhm" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.905719 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h"] Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.996941 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsnwk\" (UniqueName: \"kubernetes.io/projected/313c3847-2713-4511-ad55-237f00ee0d8e-kube-api-access-bsnwk\") pod \"metallb-operator-webhook-server-58c9c77f74-sds7h\" (UID: \"313c3847-2713-4511-ad55-237f00ee0d8e\") " pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.997270 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/313c3847-2713-4511-ad55-237f00ee0d8e-apiservice-cert\") pod 
\"metallb-operator-webhook-server-58c9c77f74-sds7h\" (UID: \"313c3847-2713-4511-ad55-237f00ee0d8e\") " pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:11 crc kubenswrapper[4795]: I0310 15:21:11.997308 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/313c3847-2713-4511-ad55-237f00ee0d8e-webhook-cert\") pod \"metallb-operator-webhook-server-58c9c77f74-sds7h\" (UID: \"313c3847-2713-4511-ad55-237f00ee0d8e\") " pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.063559 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds"] Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.089989 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.097966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/313c3847-2713-4511-ad55-237f00ee0d8e-webhook-cert\") pod \"metallb-operator-webhook-server-58c9c77f74-sds7h\" (UID: \"313c3847-2713-4511-ad55-237f00ee0d8e\") " pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.098124 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsnwk\" (UniqueName: \"kubernetes.io/projected/313c3847-2713-4511-ad55-237f00ee0d8e-kube-api-access-bsnwk\") pod \"metallb-operator-webhook-server-58c9c77f74-sds7h\" (UID: \"313c3847-2713-4511-ad55-237f00ee0d8e\") " pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.098183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/313c3847-2713-4511-ad55-237f00ee0d8e-apiservice-cert\") pod \"metallb-operator-webhook-server-58c9c77f74-sds7h\" (UID: \"313c3847-2713-4511-ad55-237f00ee0d8e\") " pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.104162 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/313c3847-2713-4511-ad55-237f00ee0d8e-webhook-cert\") pod \"metallb-operator-webhook-server-58c9c77f74-sds7h\" (UID: \"313c3847-2713-4511-ad55-237f00ee0d8e\") " pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.105953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/313c3847-2713-4511-ad55-237f00ee0d8e-apiservice-cert\") pod \"metallb-operator-webhook-server-58c9c77f74-sds7h\" (UID: \"313c3847-2713-4511-ad55-237f00ee0d8e\") " pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.124105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsnwk\" (UniqueName: \"kubernetes.io/projected/313c3847-2713-4511-ad55-237f00ee0d8e-kube-api-access-bsnwk\") pod \"metallb-operator-webhook-server-58c9c77f74-sds7h\" (UID: \"313c3847-2713-4511-ad55-237f00ee0d8e\") " pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.208197 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.405261 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h"] Mar 10 15:21:12 crc kubenswrapper[4795]: W0310 15:21:12.407224 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod313c3847_2713_4511_ad55_237f00ee0d8e.slice/crio-764638cb958857d727b994fbb42282da44b9797ddb4982e069608d9c55c17fb5 WatchSource:0}: Error finding container 764638cb958857d727b994fbb42282da44b9797ddb4982e069608d9c55c17fb5: Status 404 returned error can't find the container with id 764638cb958857d727b994fbb42282da44b9797ddb4982e069608d9c55c17fb5 Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.793204 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" event={"ID":"313c3847-2713-4511-ad55-237f00ee0d8e","Type":"ContainerStarted","Data":"764638cb958857d727b994fbb42282da44b9797ddb4982e069608d9c55c17fb5"} Mar 10 15:21:12 crc kubenswrapper[4795]: I0310 15:21:12.795061 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" event={"ID":"92eb5433-b34f-4b2b-bddf-ebe3de747f71","Type":"ContainerStarted","Data":"31578bcdca8af2d3fee297c4992c7ed48cecf44d05701787efe9a3a45981e717"} Mar 10 15:21:18 crc kubenswrapper[4795]: I0310 15:21:18.846899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" event={"ID":"313c3847-2713-4511-ad55-237f00ee0d8e","Type":"ContainerStarted","Data":"f2e74da285848b4cc9af8728ca2868e5c6738bed08a1baf519901f7386aa0071"} Mar 10 15:21:18 crc kubenswrapper[4795]: I0310 15:21:18.847333 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:18 crc kubenswrapper[4795]: I0310 15:21:18.849095 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" event={"ID":"92eb5433-b34f-4b2b-bddf-ebe3de747f71","Type":"ContainerStarted","Data":"ae2d56b28df7cde1c9f2e45c41aabfff9f1d0decc71c8dd5ce93fd64b98cc1f2"} Mar 10 15:21:18 crc kubenswrapper[4795]: I0310 15:21:18.849306 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:18 crc kubenswrapper[4795]: I0310 15:21:18.869757 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" podStartSLOduration=2.407021725 podStartE2EDuration="7.869732827s" podCreationTimestamp="2026-03-10 15:21:11 +0000 UTC" firstStartedPulling="2026-03-10 15:21:12.410850723 +0000 UTC m=+905.576591631" lastFinishedPulling="2026-03-10 15:21:17.873561835 +0000 UTC m=+911.039302733" observedRunningTime="2026-03-10 15:21:18.868038378 +0000 UTC m=+912.033779316" watchObservedRunningTime="2026-03-10 15:21:18.869732827 +0000 UTC m=+912.035473755" Mar 10 15:21:18 crc kubenswrapper[4795]: I0310 15:21:18.887188 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" podStartSLOduration=2.124055373 podStartE2EDuration="7.887160104s" podCreationTimestamp="2026-03-10 15:21:11 +0000 UTC" firstStartedPulling="2026-03-10 15:21:12.089726603 +0000 UTC m=+905.255467501" lastFinishedPulling="2026-03-10 15:21:17.852831324 +0000 UTC m=+911.018572232" observedRunningTime="2026-03-10 15:21:18.884906179 +0000 UTC m=+912.050647088" watchObservedRunningTime="2026-03-10 15:21:18.887160104 +0000 UTC m=+912.052901042" Mar 10 15:21:32 crc kubenswrapper[4795]: I0310 15:21:32.215760 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-58c9c77f74-sds7h" Mar 10 15:21:51 crc kubenswrapper[4795]: I0310 15:21:51.847335 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5585467d4f-8qzds" Mar 10 15:21:53 crc kubenswrapper[4795]: I0310 15:21:53.397557 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-l6l79"] Mar 10 15:21:53 crc kubenswrapper[4795]: I0310 15:21:53.402902 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:53 crc kubenswrapper[4795]: I0310 15:21:53.416303 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 10 15:21:53 crc kubenswrapper[4795]: I0310 15:21:53.416459 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-m5xz5" Mar 10 15:21:53 crc kubenswrapper[4795]: I0310 15:21:53.416807 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 10 15:21:53 crc kubenswrapper[4795]: I0310 15:21:53.428261 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt"] Mar 10 15:21:53 crc kubenswrapper[4795]: I0310 15:21:53.429565 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" Mar 10 15:21:53 crc kubenswrapper[4795]: I0310 15:21:53.433225 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.584456 4795 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.114738174s: [/var/lib/containers/storage/overlay/51386d15b3a98a1cd25456eb0f1bde4953bec8f8c842e3eb645870b6ae71389a/diff /var/log/pods/openshift-kube-storage-version-migrator_migrator-59844c95c7-kxzzq_457e29f1-69c5-4524-a7e0-78f4944ca94d/graceful-termination/0.log]; will not log again for this container unless duration exceeds 2s Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.585597 4795 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.096182444s: [/var/lib/containers/storage/overlay/6f1877680f35c697c227958696a9d6ae0e395b15d192b95298e23d10fe14377e/diff /var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-cpg78_264d5645-f3d4-4ad1-b7ad-01ef534d4a20/cluster-samples-operator-watch/0.log]; will not log again for this container unless duration exceeds 2s Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.585917 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnqld\" (UniqueName: \"kubernetes.io/projected/ecac241b-bada-4193-b7e0-e771dce28a24-kube-api-access-tnqld\") pod \"frr-k8s-webhook-server-7f989f654f-hsznt\" (UID: \"ecac241b-bada-4193-b7e0-e771dce28a24\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.586000 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-frr-conf\") pod \"frr-k8s-l6l79\" (UID: 
\"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.586051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmb85\" (UniqueName: \"kubernetes.io/projected/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-kube-api-access-rmb85\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.586103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-metrics\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.586146 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-frr-startup\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.586179 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecac241b-bada-4193-b7e0-e771dce28a24-cert\") pod \"frr-k8s-webhook-server-7f989f654f-hsznt\" (UID: \"ecac241b-bada-4193-b7e0-e771dce28a24\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.586201 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-frr-sockets\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " 
pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.586226 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-reloader\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.586247 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-metrics-certs\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.595023 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt"] Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.634664 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-2tb54"] Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.635885 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-2tb54" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.654815 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-f5hhw" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.655497 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.655581 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.655616 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.662442 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-r22j4"] Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.663401 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.666546 4795 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691560 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-metrics-certs\") pod \"controller-86ddb6bd46-r22j4\" (UID: \"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d\") " pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691640 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-frr-conf\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691683 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmb85\" (UniqueName: \"kubernetes.io/projected/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-kube-api-access-rmb85\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691706 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-memberlist\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691737 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-metrics\") 
pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691760 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2d1c83bb-1fd9-4500-95b5-ded04a953128-metallb-excludel2\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691839 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htm5f\" (UniqueName: \"kubernetes.io/projected/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-kube-api-access-htm5f\") pod \"controller-86ddb6bd46-r22j4\" (UID: \"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d\") " pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691873 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-frr-startup\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691904 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecac241b-bada-4193-b7e0-e771dce28a24-cert\") pod \"frr-k8s-webhook-server-7f989f654f-hsznt\" (UID: \"ecac241b-bada-4193-b7e0-e771dce28a24\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691924 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-frr-sockets\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " 
pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691946 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-reloader\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-metrics-certs\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.691989 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnqld\" (UniqueName: \"kubernetes.io/projected/ecac241b-bada-4193-b7e0-e771dce28a24-kube-api-access-tnqld\") pod \"frr-k8s-webhook-server-7f989f654f-hsznt\" (UID: \"ecac241b-bada-4193-b7e0-e771dce28a24\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.692011 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-cert\") pod \"controller-86ddb6bd46-r22j4\" (UID: \"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d\") " pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.692033 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dchwd\" (UniqueName: \"kubernetes.io/projected/2d1c83bb-1fd9-4500-95b5-ded04a953128-kube-api-access-dchwd\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:54 crc 
kubenswrapper[4795]: I0310 15:21:54.692089 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-metrics-certs\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.693015 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-frr-conf\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.698943 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-metrics\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.699505 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-reloader\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.699876 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-frr-sockets\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.701116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-frr-startup\") pod 
\"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.704656 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-r22j4"] Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.720187 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-metrics-certs\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.724299 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ecac241b-bada-4193-b7e0-e771dce28a24-cert\") pod \"frr-k8s-webhook-server-7f989f654f-hsznt\" (UID: \"ecac241b-bada-4193-b7e0-e771dce28a24\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.727089 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnqld\" (UniqueName: \"kubernetes.io/projected/ecac241b-bada-4193-b7e0-e771dce28a24-kube-api-access-tnqld\") pod \"frr-k8s-webhook-server-7f989f654f-hsznt\" (UID: \"ecac241b-bada-4193-b7e0-e771dce28a24\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.747057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmb85\" (UniqueName: \"kubernetes.io/projected/8a0a1463-1ecd-456c-9e61-7d954ebcbce4-kube-api-access-rmb85\") pod \"frr-k8s-l6l79\" (UID: \"8a0a1463-1ecd-456c-9e61-7d954ebcbce4\") " pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.793682 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-cert\") pod \"controller-86ddb6bd46-r22j4\" (UID: \"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d\") " pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.793745 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dchwd\" (UniqueName: \"kubernetes.io/projected/2d1c83bb-1fd9-4500-95b5-ded04a953128-kube-api-access-dchwd\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.793774 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-metrics-certs\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.793806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-metrics-certs\") pod \"controller-86ddb6bd46-r22j4\" (UID: \"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d\") " pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.793858 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-memberlist\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.793892 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2d1c83bb-1fd9-4500-95b5-ded04a953128-metallb-excludel2\") pod \"speaker-2tb54\" (UID: 
\"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.793919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htm5f\" (UniqueName: \"kubernetes.io/projected/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-kube-api-access-htm5f\") pod \"controller-86ddb6bd46-r22j4\" (UID: \"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d\") " pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:54 crc kubenswrapper[4795]: E0310 15:21:54.794498 4795 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 10 15:21:54 crc kubenswrapper[4795]: E0310 15:21:54.794578 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-metrics-certs podName:2d1c83bb-1fd9-4500-95b5-ded04a953128 nodeName:}" failed. No retries permitted until 2026-03-10 15:21:55.294559177 +0000 UTC m=+948.460300075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-metrics-certs") pod "speaker-2tb54" (UID: "2d1c83bb-1fd9-4500-95b5-ded04a953128") : secret "speaker-certs-secret" not found Mar 10 15:21:54 crc kubenswrapper[4795]: E0310 15:21:54.794671 4795 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 10 15:21:54 crc kubenswrapper[4795]: E0310 15:21:54.794730 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-metrics-certs podName:1ca8c314-7be3-4435-bb1c-8a27d57e2f3d nodeName:}" failed. No retries permitted until 2026-03-10 15:21:55.294712991 +0000 UTC m=+948.460453939 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-metrics-certs") pod "controller-86ddb6bd46-r22j4" (UID: "1ca8c314-7be3-4435-bb1c-8a27d57e2f3d") : secret "controller-certs-secret" not found Mar 10 15:21:54 crc kubenswrapper[4795]: E0310 15:21:54.794779 4795 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 15:21:54 crc kubenswrapper[4795]: E0310 15:21:54.794803 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-memberlist podName:2d1c83bb-1fd9-4500-95b5-ded04a953128 nodeName:}" failed. No retries permitted until 2026-03-10 15:21:55.294795154 +0000 UTC m=+948.460536112 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-memberlist") pod "speaker-2tb54" (UID: "2d1c83bb-1fd9-4500-95b5-ded04a953128") : secret "metallb-memberlist" not found Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.795554 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2d1c83bb-1fd9-4500-95b5-ded04a953128-metallb-excludel2\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.800772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-cert\") pod \"controller-86ddb6bd46-r22j4\" (UID: \"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d\") " pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.817560 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dchwd\" (UniqueName: 
\"kubernetes.io/projected/2d1c83bb-1fd9-4500-95b5-ded04a953128-kube-api-access-dchwd\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.818178 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htm5f\" (UniqueName: \"kubernetes.io/projected/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-kube-api-access-htm5f\") pod \"controller-86ddb6bd46-r22j4\" (UID: \"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d\") " pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.933555 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-l6l79" Mar 10 15:21:54 crc kubenswrapper[4795]: I0310 15:21:54.983081 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" Mar 10 15:21:55 crc kubenswrapper[4795]: I0310 15:21:55.233900 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt"] Mar 10 15:21:55 crc kubenswrapper[4795]: W0310 15:21:55.245248 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecac241b_bada_4193_b7e0_e771dce28a24.slice/crio-4ffbbf6bfda3786e4c8fb1e4f4bd25ec5d95038869751c3cd577e319beec8c29 WatchSource:0}: Error finding container 4ffbbf6bfda3786e4c8fb1e4f4bd25ec5d95038869751c3cd577e319beec8c29: Status 404 returned error can't find the container with id 4ffbbf6bfda3786e4c8fb1e4f4bd25ec5d95038869751c3cd577e319beec8c29 Mar 10 15:21:55 crc kubenswrapper[4795]: I0310 15:21:55.305661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-metrics-certs\") pod \"speaker-2tb54\" (UID: 
\"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:55 crc kubenswrapper[4795]: I0310 15:21:55.305750 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-metrics-certs\") pod \"controller-86ddb6bd46-r22j4\" (UID: \"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d\") " pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:55 crc kubenswrapper[4795]: I0310 15:21:55.306644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-memberlist\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:55 crc kubenswrapper[4795]: E0310 15:21:55.306809 4795 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 10 15:21:55 crc kubenswrapper[4795]: E0310 15:21:55.306891 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-memberlist podName:2d1c83bb-1fd9-4500-95b5-ded04a953128 nodeName:}" failed. No retries permitted until 2026-03-10 15:21:56.306873409 +0000 UTC m=+949.472614307 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-memberlist") pod "speaker-2tb54" (UID: "2d1c83bb-1fd9-4500-95b5-ded04a953128") : secret "metallb-memberlist" not found Mar 10 15:21:55 crc kubenswrapper[4795]: I0310 15:21:55.311662 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-metrics-certs\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:55 crc kubenswrapper[4795]: I0310 15:21:55.311719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1ca8c314-7be3-4435-bb1c-8a27d57e2f3d-metrics-certs\") pod \"controller-86ddb6bd46-r22j4\" (UID: \"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d\") " pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:55 crc kubenswrapper[4795]: I0310 15:21:55.381421 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:55 crc kubenswrapper[4795]: I0310 15:21:55.605126 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-r22j4"] Mar 10 15:21:55 crc kubenswrapper[4795]: W0310 15:21:55.612940 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca8c314_7be3_4435_bb1c_8a27d57e2f3d.slice/crio-9f6ae3cf49319e81673057cf34e920bafd225683ee56ae5ae503c4f5e9cfa462 WatchSource:0}: Error finding container 9f6ae3cf49319e81673057cf34e920bafd225683ee56ae5ae503c4f5e9cfa462: Status 404 returned error can't find the container with id 9f6ae3cf49319e81673057cf34e920bafd225683ee56ae5ae503c4f5e9cfa462 Mar 10 15:21:55 crc kubenswrapper[4795]: I0310 15:21:55.646380 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" event={"ID":"ecac241b-bada-4193-b7e0-e771dce28a24","Type":"ContainerStarted","Data":"4ffbbf6bfda3786e4c8fb1e4f4bd25ec5d95038869751c3cd577e319beec8c29"} Mar 10 15:21:55 crc kubenswrapper[4795]: I0310 15:21:55.650291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6l79" event={"ID":"8a0a1463-1ecd-456c-9e61-7d954ebcbce4","Type":"ContainerStarted","Data":"be509f7f1c01e6cfeb8e079a96d7efddedede0eb7e16c43b3a3375bb39636cec"} Mar 10 15:21:55 crc kubenswrapper[4795]: I0310 15:21:55.652433 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-r22j4" event={"ID":"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d","Type":"ContainerStarted","Data":"9f6ae3cf49319e81673057cf34e920bafd225683ee56ae5ae503c4f5e9cfa462"} Mar 10 15:21:56 crc kubenswrapper[4795]: I0310 15:21:56.323087 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-memberlist\") pod 
\"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:56 crc kubenswrapper[4795]: I0310 15:21:56.329343 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2d1c83bb-1fd9-4500-95b5-ded04a953128-memberlist\") pod \"speaker-2tb54\" (UID: \"2d1c83bb-1fd9-4500-95b5-ded04a953128\") " pod="metallb-system/speaker-2tb54" Mar 10 15:21:56 crc kubenswrapper[4795]: I0310 15:21:56.571706 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-2tb54" Mar 10 15:21:56 crc kubenswrapper[4795]: I0310 15:21:56.659438 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2tb54" event={"ID":"2d1c83bb-1fd9-4500-95b5-ded04a953128","Type":"ContainerStarted","Data":"a8aeb277cd742095fd92dd4c88bfff54d53d86dce7dbe2ab3406b528ee2c2251"} Mar 10 15:21:56 crc kubenswrapper[4795]: I0310 15:21:56.661584 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-r22j4" event={"ID":"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d","Type":"ContainerStarted","Data":"618be0727d7221d2b0c630d29a8ed80f521b59ebb899c2678d63ff17fdb75ab6"} Mar 10 15:21:56 crc kubenswrapper[4795]: I0310 15:21:56.661626 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-r22j4" event={"ID":"1ca8c314-7be3-4435-bb1c-8a27d57e2f3d","Type":"ContainerStarted","Data":"a4e1318945d2c3d8e20e7567f74e0370ac271aad1e6225ad50e0f779e0ef1995"} Mar 10 15:21:56 crc kubenswrapper[4795]: I0310 15:21:56.661715 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:21:57 crc kubenswrapper[4795]: I0310 15:21:57.505975 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-r22j4" podStartSLOduration=4.505960478 podStartE2EDuration="4.505960478s" 
podCreationTimestamp="2026-03-10 15:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:21:56.674832064 +0000 UTC m=+949.840572972" watchObservedRunningTime="2026-03-10 15:21:57.505960478 +0000 UTC m=+950.671701376" Mar 10 15:21:57 crc kubenswrapper[4795]: I0310 15:21:57.668774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2tb54" event={"ID":"2d1c83bb-1fd9-4500-95b5-ded04a953128","Type":"ContainerStarted","Data":"418299d66ce1f1efc2ce5ba2b7dad280d304dfbf3341a8f05c23472e6592ad5b"} Mar 10 15:21:57 crc kubenswrapper[4795]: I0310 15:21:57.669370 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-2tb54" event={"ID":"2d1c83bb-1fd9-4500-95b5-ded04a953128","Type":"ContainerStarted","Data":"15e72b5c79be91673d9f4ebfa90589b6c8c9e97880598690b82e644adfac0515"} Mar 10 15:21:57 crc kubenswrapper[4795]: I0310 15:21:57.687142 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-2tb54" podStartSLOduration=4.687123755 podStartE2EDuration="4.687123755s" podCreationTimestamp="2026-03-10 15:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:21:57.683790999 +0000 UTC m=+950.849531917" watchObservedRunningTime="2026-03-10 15:21:57.687123755 +0000 UTC m=+950.852864653" Mar 10 15:21:58 crc kubenswrapper[4795]: I0310 15:21:58.673761 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-2tb54" Mar 10 15:22:00 crc kubenswrapper[4795]: I0310 15:22:00.135793 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552602-h76ns"] Mar 10 15:22:00 crc kubenswrapper[4795]: I0310 15:22:00.137014 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552602-h76ns" Mar 10 15:22:00 crc kubenswrapper[4795]: I0310 15:22:00.139795 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:22:00 crc kubenswrapper[4795]: I0310 15:22:00.140057 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:22:00 crc kubenswrapper[4795]: I0310 15:22:00.140232 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:22:00 crc kubenswrapper[4795]: I0310 15:22:00.142794 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552602-h76ns"] Mar 10 15:22:00 crc kubenswrapper[4795]: I0310 15:22:00.196874 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvwg7\" (UniqueName: \"kubernetes.io/projected/fe4cc933-6bcb-4283-bca6-8645ac162270-kube-api-access-vvwg7\") pod \"auto-csr-approver-29552602-h76ns\" (UID: \"fe4cc933-6bcb-4283-bca6-8645ac162270\") " pod="openshift-infra/auto-csr-approver-29552602-h76ns" Mar 10 15:22:00 crc kubenswrapper[4795]: I0310 15:22:00.297713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvwg7\" (UniqueName: \"kubernetes.io/projected/fe4cc933-6bcb-4283-bca6-8645ac162270-kube-api-access-vvwg7\") pod \"auto-csr-approver-29552602-h76ns\" (UID: \"fe4cc933-6bcb-4283-bca6-8645ac162270\") " pod="openshift-infra/auto-csr-approver-29552602-h76ns" Mar 10 15:22:00 crc kubenswrapper[4795]: I0310 15:22:00.332550 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvwg7\" (UniqueName: \"kubernetes.io/projected/fe4cc933-6bcb-4283-bca6-8645ac162270-kube-api-access-vvwg7\") pod \"auto-csr-approver-29552602-h76ns\" (UID: \"fe4cc933-6bcb-4283-bca6-8645ac162270\") " 
pod="openshift-infra/auto-csr-approver-29552602-h76ns" Mar 10 15:22:00 crc kubenswrapper[4795]: I0310 15:22:00.474359 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552602-h76ns" Mar 10 15:22:00 crc kubenswrapper[4795]: I0310 15:22:00.920328 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552602-h76ns"] Mar 10 15:22:01 crc kubenswrapper[4795]: I0310 15:22:01.697357 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552602-h76ns" event={"ID":"fe4cc933-6bcb-4283-bca6-8645ac162270","Type":"ContainerStarted","Data":"3d087ae9c0036f0ce62333b9df670472227f4aed478a7db1a9c9ae5e396af4d9"} Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.299011 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zzj4q"] Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.300713 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.326255 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzj4q"] Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.370712 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-catalog-content\") pod \"certified-operators-zzj4q\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.370765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mb8\" (UniqueName: \"kubernetes.io/projected/d1057884-75fd-49cf-a65d-b1ee9a610700-kube-api-access-r7mb8\") pod \"certified-operators-zzj4q\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.371022 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-utilities\") pod \"certified-operators-zzj4q\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.472153 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-utilities\") pod \"certified-operators-zzj4q\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.472235 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-catalog-content\") pod \"certified-operators-zzj4q\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.472261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mb8\" (UniqueName: \"kubernetes.io/projected/d1057884-75fd-49cf-a65d-b1ee9a610700-kube-api-access-r7mb8\") pod \"certified-operators-zzj4q\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.472760 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-utilities\") pod \"certified-operators-zzj4q\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.472926 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-catalog-content\") pod \"certified-operators-zzj4q\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.491313 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mb8\" (UniqueName: \"kubernetes.io/projected/d1057884-75fd-49cf-a65d-b1ee9a610700-kube-api-access-r7mb8\") pod \"certified-operators-zzj4q\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.654120 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.723444 4795 generic.go:334] "Generic (PLEG): container finished" podID="8a0a1463-1ecd-456c-9e61-7d954ebcbce4" containerID="e008c9a63ac9f0ef1a0ef83363fb497afc12aa427452ecdf72e714fd66496211" exitCode=0 Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.723491 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6l79" event={"ID":"8a0a1463-1ecd-456c-9e61-7d954ebcbce4","Type":"ContainerDied","Data":"e008c9a63ac9f0ef1a0ef83363fb497afc12aa427452ecdf72e714fd66496211"} Mar 10 15:22:04 crc kubenswrapper[4795]: I0310 15:22:04.977730 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zzj4q"] Mar 10 15:22:05 crc kubenswrapper[4795]: I0310 15:22:05.388997 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-r22j4" Mar 10 15:22:05 crc kubenswrapper[4795]: I0310 15:22:05.731923 4795 generic.go:334] "Generic (PLEG): container finished" podID="8a0a1463-1ecd-456c-9e61-7d954ebcbce4" containerID="a01fa7d285790f16eeab175d807ad61b5f076bf80c28398a1ad0bd71c6b62350" exitCode=0 Mar 10 15:22:05 crc kubenswrapper[4795]: I0310 15:22:05.731974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6l79" event={"ID":"8a0a1463-1ecd-456c-9e61-7d954ebcbce4","Type":"ContainerDied","Data":"a01fa7d285790f16eeab175d807ad61b5f076bf80c28398a1ad0bd71c6b62350"} Mar 10 15:22:05 crc kubenswrapper[4795]: I0310 15:22:05.735878 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1057884-75fd-49cf-a65d-b1ee9a610700" containerID="0bcf4375e4e5718dbdbc3f62fe43dcd7486a17af8ddf8336df7c76fd9fad42e8" exitCode=0 Mar 10 15:22:05 crc kubenswrapper[4795]: I0310 15:22:05.735964 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzj4q" 
event={"ID":"d1057884-75fd-49cf-a65d-b1ee9a610700","Type":"ContainerDied","Data":"0bcf4375e4e5718dbdbc3f62fe43dcd7486a17af8ddf8336df7c76fd9fad42e8"} Mar 10 15:22:05 crc kubenswrapper[4795]: I0310 15:22:05.735994 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzj4q" event={"ID":"d1057884-75fd-49cf-a65d-b1ee9a610700","Type":"ContainerStarted","Data":"4bd0fdfe7ec129a00442d42156271ad3ceb0439ef75029789da512e3fb105187"} Mar 10 15:22:05 crc kubenswrapper[4795]: I0310 15:22:05.738559 4795 generic.go:334] "Generic (PLEG): container finished" podID="fe4cc933-6bcb-4283-bca6-8645ac162270" containerID="e2375420575b7940817451a4f11faa3d001564ebc848513489f589cf30f357a6" exitCode=0 Mar 10 15:22:05 crc kubenswrapper[4795]: I0310 15:22:05.738625 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552602-h76ns" event={"ID":"fe4cc933-6bcb-4283-bca6-8645ac162270","Type":"ContainerDied","Data":"e2375420575b7940817451a4f11faa3d001564ebc848513489f589cf30f357a6"} Mar 10 15:22:06 crc kubenswrapper[4795]: I0310 15:22:06.576719 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-2tb54" Mar 10 15:22:06 crc kubenswrapper[4795]: I0310 15:22:06.748161 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzj4q" event={"ID":"d1057884-75fd-49cf-a65d-b1ee9a610700","Type":"ContainerStarted","Data":"f442facc525a79630313b9bdc7df58ae475a5fedc5949ed4b8ebb3f92d52a159"} Mar 10 15:22:06 crc kubenswrapper[4795]: I0310 15:22:06.752428 4795 generic.go:334] "Generic (PLEG): container finished" podID="8a0a1463-1ecd-456c-9e61-7d954ebcbce4" containerID="ce1859effd7d9e69806598e6aab67d3564e5db5d52fcb23fa37a05782f5f79d7" exitCode=0 Mar 10 15:22:06 crc kubenswrapper[4795]: I0310 15:22:06.752592 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6l79" 
event={"ID":"8a0a1463-1ecd-456c-9e61-7d954ebcbce4","Type":"ContainerDied","Data":"ce1859effd7d9e69806598e6aab67d3564e5db5d52fcb23fa37a05782f5f79d7"} Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.051873 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552602-h76ns" Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.127561 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvwg7\" (UniqueName: \"kubernetes.io/projected/fe4cc933-6bcb-4283-bca6-8645ac162270-kube-api-access-vvwg7\") pod \"fe4cc933-6bcb-4283-bca6-8645ac162270\" (UID: \"fe4cc933-6bcb-4283-bca6-8645ac162270\") " Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.136363 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe4cc933-6bcb-4283-bca6-8645ac162270-kube-api-access-vvwg7" (OuterVolumeSpecName: "kube-api-access-vvwg7") pod "fe4cc933-6bcb-4283-bca6-8645ac162270" (UID: "fe4cc933-6bcb-4283-bca6-8645ac162270"). InnerVolumeSpecName "kube-api-access-vvwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.229201 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvwg7\" (UniqueName: \"kubernetes.io/projected/fe4cc933-6bcb-4283-bca6-8645ac162270-kube-api-access-vvwg7\") on node \"crc\" DevicePath \"\"" Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.760259 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1057884-75fd-49cf-a65d-b1ee9a610700" containerID="f442facc525a79630313b9bdc7df58ae475a5fedc5949ed4b8ebb3f92d52a159" exitCode=0 Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.760317 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzj4q" event={"ID":"d1057884-75fd-49cf-a65d-b1ee9a610700","Type":"ContainerDied","Data":"f442facc525a79630313b9bdc7df58ae475a5fedc5949ed4b8ebb3f92d52a159"} Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.764041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552602-h76ns" event={"ID":"fe4cc933-6bcb-4283-bca6-8645ac162270","Type":"ContainerDied","Data":"3d087ae9c0036f0ce62333b9df670472227f4aed478a7db1a9c9ae5e396af4d9"} Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.764115 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d087ae9c0036f0ce62333b9df670472227f4aed478a7db1a9c9ae5e396af4d9" Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.764326 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552602-h76ns" Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.767689 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" event={"ID":"ecac241b-bada-4193-b7e0-e771dce28a24","Type":"ContainerStarted","Data":"a83c48531f49ef10b7222d9557354d3b01e60ad391ba05813013a71e0a0457b9"} Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.767743 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.773663 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6l79" event={"ID":"8a0a1463-1ecd-456c-9e61-7d954ebcbce4","Type":"ContainerStarted","Data":"c4ea61ac520a1271e8509856dcf985803787e1d4cdb095ed9d695c0d1f2857ca"} Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.773707 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6l79" event={"ID":"8a0a1463-1ecd-456c-9e61-7d954ebcbce4","Type":"ContainerStarted","Data":"9e3e8f1aa1d21437c25ddaff251260518bdebd49644f930fdf8da23203bab330"} Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.773719 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6l79" event={"ID":"8a0a1463-1ecd-456c-9e61-7d954ebcbce4","Type":"ContainerStarted","Data":"38809609ee5c82e7a5e19840686bad4101f0c7242c7b27875d960a5a024628e8"} Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.773729 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6l79" event={"ID":"8a0a1463-1ecd-456c-9e61-7d954ebcbce4","Type":"ContainerStarted","Data":"272f6cdf8de3a11819edb010e5c275227c0a90d338e3205c4c6dfdfa90e58b26"} Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.773740 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6l79" 
event={"ID":"8a0a1463-1ecd-456c-9e61-7d954ebcbce4","Type":"ContainerStarted","Data":"a80087b10fbf112f61cc6e8c7e3b0421b58f60b4fc427d8593c68e20bb28787a"} Mar 10 15:22:07 crc kubenswrapper[4795]: I0310 15:22:07.871537 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" podStartSLOduration=2.83534678 podStartE2EDuration="14.871515739s" podCreationTimestamp="2026-03-10 15:21:53 +0000 UTC" firstStartedPulling="2026-03-10 15:21:55.249542164 +0000 UTC m=+948.415283072" lastFinishedPulling="2026-03-10 15:22:07.285711133 +0000 UTC m=+960.451452031" observedRunningTime="2026-03-10 15:22:07.819695661 +0000 UTC m=+960.985436559" watchObservedRunningTime="2026-03-10 15:22:07.871515739 +0000 UTC m=+961.037256637" Mar 10 15:22:08 crc kubenswrapper[4795]: I0310 15:22:08.096501 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552596-vvqwp"] Mar 10 15:22:08 crc kubenswrapper[4795]: I0310 15:22:08.100478 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552596-vvqwp"] Mar 10 15:22:08 crc kubenswrapper[4795]: I0310 15:22:08.786197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l6l79" event={"ID":"8a0a1463-1ecd-456c-9e61-7d954ebcbce4","Type":"ContainerStarted","Data":"a53fc86f4b58fa5fcad31171b403e5441e170c7fed5926bcf80e77a7566d5927"} Mar 10 15:22:08 crc kubenswrapper[4795]: I0310 15:22:08.787163 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-l6l79" Mar 10 15:22:08 crc kubenswrapper[4795]: I0310 15:22:08.790312 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzj4q" event={"ID":"d1057884-75fd-49cf-a65d-b1ee9a610700","Type":"ContainerStarted","Data":"fb94f1e2269bb97c292242ab657185882407cb85af687a23725d0622ad4516d8"} Mar 10 15:22:08 crc kubenswrapper[4795]: I0310 15:22:08.824745 
4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-l6l79" podStartSLOduration=6.567455754 podStartE2EDuration="15.824726656s" podCreationTimestamp="2026-03-10 15:21:53 +0000 UTC" firstStartedPulling="2026-03-10 15:21:55.264389317 +0000 UTC m=+948.430130215" lastFinishedPulling="2026-03-10 15:22:04.521660219 +0000 UTC m=+957.687401117" observedRunningTime="2026-03-10 15:22:08.820836895 +0000 UTC m=+961.986577803" watchObservedRunningTime="2026-03-10 15:22:08.824726656 +0000 UTC m=+961.990467574" Mar 10 15:22:09 crc kubenswrapper[4795]: I0310 15:22:09.486815 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb92169a-5109-4bf5-85e4-313837d438d4" path="/var/lib/kubelet/pods/eb92169a-5109-4bf5-85e4-313837d438d4/volumes" Mar 10 15:22:09 crc kubenswrapper[4795]: I0310 15:22:09.934274 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-l6l79" Mar 10 15:22:09 crc kubenswrapper[4795]: I0310 15:22:09.970003 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-l6l79" Mar 10 15:22:09 crc kubenswrapper[4795]: I0310 15:22:09.996885 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zzj4q" podStartSLOduration=3.483749541 podStartE2EDuration="5.996866956s" podCreationTimestamp="2026-03-10 15:22:04 +0000 UTC" firstStartedPulling="2026-03-10 15:22:05.737249539 +0000 UTC m=+958.902990437" lastFinishedPulling="2026-03-10 15:22:08.250366954 +0000 UTC m=+961.416107852" observedRunningTime="2026-03-10 15:22:08.843372178 +0000 UTC m=+962.009113116" watchObservedRunningTime="2026-03-10 15:22:09.996866956 +0000 UTC m=+963.162607864" Mar 10 15:22:12 crc kubenswrapper[4795]: I0310 15:22:12.686655 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-bpvzt"] Mar 10 15:22:12 crc kubenswrapper[4795]: E0310 
15:22:12.687515 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe4cc933-6bcb-4283-bca6-8645ac162270" containerName="oc" Mar 10 15:22:12 crc kubenswrapper[4795]: I0310 15:22:12.687536 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe4cc933-6bcb-4283-bca6-8645ac162270" containerName="oc" Mar 10 15:22:12 crc kubenswrapper[4795]: I0310 15:22:12.687694 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe4cc933-6bcb-4283-bca6-8645ac162270" containerName="oc" Mar 10 15:22:12 crc kubenswrapper[4795]: I0310 15:22:12.688197 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bpvzt" Mar 10 15:22:12 crc kubenswrapper[4795]: I0310 15:22:12.691671 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-ctfjx" Mar 10 15:22:12 crc kubenswrapper[4795]: I0310 15:22:12.692021 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 10 15:22:12 crc kubenswrapper[4795]: I0310 15:22:12.692237 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 10 15:22:12 crc kubenswrapper[4795]: I0310 15:22:12.693798 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bpvzt"] Mar 10 15:22:12 crc kubenswrapper[4795]: I0310 15:22:12.830121 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2v22\" (UniqueName: \"kubernetes.io/projected/15382ecd-b669-452c-8cee-6abdc8828035-kube-api-access-h2v22\") pod \"openstack-operator-index-bpvzt\" (UID: \"15382ecd-b669-452c-8cee-6abdc8828035\") " pod="openstack-operators/openstack-operator-index-bpvzt" Mar 10 15:22:12 crc kubenswrapper[4795]: I0310 15:22:12.932223 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-h2v22\" (UniqueName: \"kubernetes.io/projected/15382ecd-b669-452c-8cee-6abdc8828035-kube-api-access-h2v22\") pod \"openstack-operator-index-bpvzt\" (UID: \"15382ecd-b669-452c-8cee-6abdc8828035\") " pod="openstack-operators/openstack-operator-index-bpvzt" Mar 10 15:22:12 crc kubenswrapper[4795]: I0310 15:22:12.960908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2v22\" (UniqueName: \"kubernetes.io/projected/15382ecd-b669-452c-8cee-6abdc8828035-kube-api-access-h2v22\") pod \"openstack-operator-index-bpvzt\" (UID: \"15382ecd-b669-452c-8cee-6abdc8828035\") " pod="openstack-operators/openstack-operator-index-bpvzt" Mar 10 15:22:13 crc kubenswrapper[4795]: I0310 15:22:13.009105 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bpvzt" Mar 10 15:22:13 crc kubenswrapper[4795]: I0310 15:22:13.449372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bpvzt"] Mar 10 15:22:13 crc kubenswrapper[4795]: W0310 15:22:13.464840 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15382ecd_b669_452c_8cee_6abdc8828035.slice/crio-f29f0fd235cb9e398425a784d661d3bb03f11b56235544518a0c3e97f2975914 WatchSource:0}: Error finding container f29f0fd235cb9e398425a784d661d3bb03f11b56235544518a0c3e97f2975914: Status 404 returned error can't find the container with id f29f0fd235cb9e398425a784d661d3bb03f11b56235544518a0c3e97f2975914 Mar 10 15:22:13 crc kubenswrapper[4795]: I0310 15:22:13.831938 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bpvzt" event={"ID":"15382ecd-b669-452c-8cee-6abdc8828035","Type":"ContainerStarted","Data":"f29f0fd235cb9e398425a784d661d3bb03f11b56235544518a0c3e97f2975914"} Mar 10 15:22:14 crc kubenswrapper[4795]: I0310 
15:22:14.680049 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:14 crc kubenswrapper[4795]: I0310 15:22:14.681505 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:14 crc kubenswrapper[4795]: I0310 15:22:14.754518 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:14 crc kubenswrapper[4795]: I0310 15:22:14.899783 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:16 crc kubenswrapper[4795]: I0310 15:22:16.855731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bpvzt" event={"ID":"15382ecd-b669-452c-8cee-6abdc8828035","Type":"ContainerStarted","Data":"85fb3304b7b71997cabfb1bfda36e710ea1c7b5289ee7d6ff83921d5b7a57542"} Mar 10 15:22:16 crc kubenswrapper[4795]: I0310 15:22:16.890520 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-bpvzt" podStartSLOduration=1.809545108 podStartE2EDuration="4.890470617s" podCreationTimestamp="2026-03-10 15:22:12 +0000 UTC" firstStartedPulling="2026-03-10 15:22:13.469161139 +0000 UTC m=+966.634902077" lastFinishedPulling="2026-03-10 15:22:16.550086688 +0000 UTC m=+969.715827586" observedRunningTime="2026-03-10 15:22:16.875837769 +0000 UTC m=+970.041578727" watchObservedRunningTime="2026-03-10 15:22:16.890470617 +0000 UTC m=+970.056211575" Mar 10 15:22:18 crc kubenswrapper[4795]: I0310 15:22:18.539023 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 15:22:18 crc kubenswrapper[4795]: I0310 15:22:18.539536 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:22:18 crc kubenswrapper[4795]: I0310 15:22:18.692285 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzj4q"] Mar 10 15:22:18 crc kubenswrapper[4795]: I0310 15:22:18.692738 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zzj4q" podUID="d1057884-75fd-49cf-a65d-b1ee9a610700" containerName="registry-server" containerID="cri-o://fb94f1e2269bb97c292242ab657185882407cb85af687a23725d0622ad4516d8" gracePeriod=2 Mar 10 15:22:18 crc kubenswrapper[4795]: I0310 15:22:18.878197 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1057884-75fd-49cf-a65d-b1ee9a610700" containerID="fb94f1e2269bb97c292242ab657185882407cb85af687a23725d0622ad4516d8" exitCode=0 Mar 10 15:22:18 crc kubenswrapper[4795]: I0310 15:22:18.878283 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzj4q" event={"ID":"d1057884-75fd-49cf-a65d-b1ee9a610700","Type":"ContainerDied","Data":"fb94f1e2269bb97c292242ab657185882407cb85af687a23725d0622ad4516d8"} Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.121905 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.163645 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-catalog-content\") pod \"d1057884-75fd-49cf-a65d-b1ee9a610700\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.163756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7mb8\" (UniqueName: \"kubernetes.io/projected/d1057884-75fd-49cf-a65d-b1ee9a610700-kube-api-access-r7mb8\") pod \"d1057884-75fd-49cf-a65d-b1ee9a610700\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.163779 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-utilities\") pod \"d1057884-75fd-49cf-a65d-b1ee9a610700\" (UID: \"d1057884-75fd-49cf-a65d-b1ee9a610700\") " Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.164624 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-utilities" (OuterVolumeSpecName: "utilities") pod "d1057884-75fd-49cf-a65d-b1ee9a610700" (UID: "d1057884-75fd-49cf-a65d-b1ee9a610700"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.177439 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1057884-75fd-49cf-a65d-b1ee9a610700-kube-api-access-r7mb8" (OuterVolumeSpecName: "kube-api-access-r7mb8") pod "d1057884-75fd-49cf-a65d-b1ee9a610700" (UID: "d1057884-75fd-49cf-a65d-b1ee9a610700"). InnerVolumeSpecName "kube-api-access-r7mb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.222254 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d1057884-75fd-49cf-a65d-b1ee9a610700" (UID: "d1057884-75fd-49cf-a65d-b1ee9a610700"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.265126 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7mb8\" (UniqueName: \"kubernetes.io/projected/d1057884-75fd-49cf-a65d-b1ee9a610700-kube-api-access-r7mb8\") on node \"crc\" DevicePath \"\"" Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.265160 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.265172 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d1057884-75fd-49cf-a65d-b1ee9a610700-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.890374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zzj4q" event={"ID":"d1057884-75fd-49cf-a65d-b1ee9a610700","Type":"ContainerDied","Data":"4bd0fdfe7ec129a00442d42156271ad3ceb0439ef75029789da512e3fb105187"} Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.890498 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zzj4q" Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.891448 4795 scope.go:117] "RemoveContainer" containerID="fb94f1e2269bb97c292242ab657185882407cb85af687a23725d0622ad4516d8" Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.937680 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zzj4q"] Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.937722 4795 scope.go:117] "RemoveContainer" containerID="f442facc525a79630313b9bdc7df58ae475a5fedc5949ed4b8ebb3f92d52a159" Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.968088 4795 scope.go:117] "RemoveContainer" containerID="0bcf4375e4e5718dbdbc3f62fe43dcd7486a17af8ddf8336df7c76fd9fad42e8" Mar 10 15:22:19 crc kubenswrapper[4795]: I0310 15:22:19.980564 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zzj4q"] Mar 10 15:22:21 crc kubenswrapper[4795]: I0310 15:22:21.493839 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1057884-75fd-49cf-a65d-b1ee9a610700" path="/var/lib/kubelet/pods/d1057884-75fd-49cf-a65d-b1ee9a610700/volumes" Mar 10 15:22:23 crc kubenswrapper[4795]: I0310 15:22:23.010488 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-bpvzt" Mar 10 15:22:23 crc kubenswrapper[4795]: I0310 15:22:23.012947 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-bpvzt" Mar 10 15:22:23 crc kubenswrapper[4795]: I0310 15:22:23.055606 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-bpvzt" Mar 10 15:22:23 crc kubenswrapper[4795]: I0310 15:22:23.967340 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-bpvzt" Mar 10 
15:22:24 crc kubenswrapper[4795]: I0310 15:22:24.937620 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-l6l79" Mar 10 15:22:24 crc kubenswrapper[4795]: I0310 15:22:24.995495 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-hsznt" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.521309 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz"] Mar 10 15:22:25 crc kubenswrapper[4795]: E0310 15:22:25.521743 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1057884-75fd-49cf-a65d-b1ee9a610700" containerName="registry-server" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.521755 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1057884-75fd-49cf-a65d-b1ee9a610700" containerName="registry-server" Mar 10 15:22:25 crc kubenswrapper[4795]: E0310 15:22:25.521774 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1057884-75fd-49cf-a65d-b1ee9a610700" containerName="extract-utilities" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.521780 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1057884-75fd-49cf-a65d-b1ee9a610700" containerName="extract-utilities" Mar 10 15:22:25 crc kubenswrapper[4795]: E0310 15:22:25.521787 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1057884-75fd-49cf-a65d-b1ee9a610700" containerName="extract-content" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.521792 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1057884-75fd-49cf-a65d-b1ee9a610700" containerName="extract-content" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.521893 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1057884-75fd-49cf-a65d-b1ee9a610700" containerName="registry-server" Mar 10 15:22:25 crc 
kubenswrapper[4795]: I0310 15:22:25.522640 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.524468 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-bvqpm" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.538399 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz"] Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.569321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-util\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.569574 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-bundle\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.569665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gh8z\" (UniqueName: \"kubernetes.io/projected/d2ffa720-9d8d-48bb-ba13-87d605145e4e-kube-api-access-7gh8z\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " 
pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.671139 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-util\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.671283 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-bundle\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.671342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gh8z\" (UniqueName: \"kubernetes.io/projected/d2ffa720-9d8d-48bb-ba13-87d605145e4e-kube-api-access-7gh8z\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.672317 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-util\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.672956 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-bundle\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.707144 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gh8z\" (UniqueName: \"kubernetes.io/projected/d2ffa720-9d8d-48bb-ba13-87d605145e4e-kube-api-access-7gh8z\") pod \"f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:25 crc kubenswrapper[4795]: I0310 15:22:25.842241 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:26 crc kubenswrapper[4795]: I0310 15:22:26.145235 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz"] Mar 10 15:22:26 crc kubenswrapper[4795]: W0310 15:22:26.155287 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2ffa720_9d8d_48bb_ba13_87d605145e4e.slice/crio-1e88c0f1b5f5a0b09b56778f7c62449ae835e0ad0aa2c4ecbe9a8421db9bf84c WatchSource:0}: Error finding container 1e88c0f1b5f5a0b09b56778f7c62449ae835e0ad0aa2c4ecbe9a8421db9bf84c: Status 404 returned error can't find the container with id 1e88c0f1b5f5a0b09b56778f7c62449ae835e0ad0aa2c4ecbe9a8421db9bf84c Mar 10 15:22:26 crc kubenswrapper[4795]: I0310 15:22:26.952716 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2ffa720-9d8d-48bb-ba13-87d605145e4e" 
containerID="ff2f4a0c01ff67dd8ee03bbf1c4e83ba379a1e22d61ee46fe2f34e2537a7f5d3" exitCode=0 Mar 10 15:22:26 crc kubenswrapper[4795]: I0310 15:22:26.952826 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" event={"ID":"d2ffa720-9d8d-48bb-ba13-87d605145e4e","Type":"ContainerDied","Data":"ff2f4a0c01ff67dd8ee03bbf1c4e83ba379a1e22d61ee46fe2f34e2537a7f5d3"} Mar 10 15:22:26 crc kubenswrapper[4795]: I0310 15:22:26.952901 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" event={"ID":"d2ffa720-9d8d-48bb-ba13-87d605145e4e","Type":"ContainerStarted","Data":"1e88c0f1b5f5a0b09b56778f7c62449ae835e0ad0aa2c4ecbe9a8421db9bf84c"} Mar 10 15:22:27 crc kubenswrapper[4795]: I0310 15:22:27.964847 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2ffa720-9d8d-48bb-ba13-87d605145e4e" containerID="433e670a5cbb9b0eef0988635832c42ea0513f9886d30fd45bb0206b969aea04" exitCode=0 Mar 10 15:22:27 crc kubenswrapper[4795]: I0310 15:22:27.964909 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" event={"ID":"d2ffa720-9d8d-48bb-ba13-87d605145e4e","Type":"ContainerDied","Data":"433e670a5cbb9b0eef0988635832c42ea0513f9886d30fd45bb0206b969aea04"} Mar 10 15:22:28 crc kubenswrapper[4795]: I0310 15:22:28.978188 4795 generic.go:334] "Generic (PLEG): container finished" podID="d2ffa720-9d8d-48bb-ba13-87d605145e4e" containerID="b1ee5feec3a25690474702e5cc289cdf406d2be3e1e304753a63295f71d34ff3" exitCode=0 Mar 10 15:22:28 crc kubenswrapper[4795]: I0310 15:22:28.978253 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" 
event={"ID":"d2ffa720-9d8d-48bb-ba13-87d605145e4e","Type":"ContainerDied","Data":"b1ee5feec3a25690474702e5cc289cdf406d2be3e1e304753a63295f71d34ff3"} Mar 10 15:22:30 crc kubenswrapper[4795]: I0310 15:22:30.302575 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:30 crc kubenswrapper[4795]: I0310 15:22:30.438160 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-bundle\") pod \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " Mar 10 15:22:30 crc kubenswrapper[4795]: I0310 15:22:30.438263 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-util\") pod \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " Mar 10 15:22:30 crc kubenswrapper[4795]: I0310 15:22:30.438382 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gh8z\" (UniqueName: \"kubernetes.io/projected/d2ffa720-9d8d-48bb-ba13-87d605145e4e-kube-api-access-7gh8z\") pod \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\" (UID: \"d2ffa720-9d8d-48bb-ba13-87d605145e4e\") " Mar 10 15:22:30 crc kubenswrapper[4795]: I0310 15:22:30.439357 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-bundle" (OuterVolumeSpecName: "bundle") pod "d2ffa720-9d8d-48bb-ba13-87d605145e4e" (UID: "d2ffa720-9d8d-48bb-ba13-87d605145e4e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:22:30 crc kubenswrapper[4795]: I0310 15:22:30.444824 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ffa720-9d8d-48bb-ba13-87d605145e4e-kube-api-access-7gh8z" (OuterVolumeSpecName: "kube-api-access-7gh8z") pod "d2ffa720-9d8d-48bb-ba13-87d605145e4e" (UID: "d2ffa720-9d8d-48bb-ba13-87d605145e4e"). InnerVolumeSpecName "kube-api-access-7gh8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:22:30 crc kubenswrapper[4795]: I0310 15:22:30.452152 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-util" (OuterVolumeSpecName: "util") pod "d2ffa720-9d8d-48bb-ba13-87d605145e4e" (UID: "d2ffa720-9d8d-48bb-ba13-87d605145e4e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:22:30 crc kubenswrapper[4795]: I0310 15:22:30.539870 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gh8z\" (UniqueName: \"kubernetes.io/projected/d2ffa720-9d8d-48bb-ba13-87d605145e4e-kube-api-access-7gh8z\") on node \"crc\" DevicePath \"\"" Mar 10 15:22:30 crc kubenswrapper[4795]: I0310 15:22:30.539931 4795 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:22:30 crc kubenswrapper[4795]: I0310 15:22:30.540114 4795 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d2ffa720-9d8d-48bb-ba13-87d605145e4e-util\") on node \"crc\" DevicePath \"\"" Mar 10 15:22:31 crc kubenswrapper[4795]: I0310 15:22:30.999736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" 
event={"ID":"d2ffa720-9d8d-48bb-ba13-87d605145e4e","Type":"ContainerDied","Data":"1e88c0f1b5f5a0b09b56778f7c62449ae835e0ad0aa2c4ecbe9a8421db9bf84c"} Mar 10 15:22:31 crc kubenswrapper[4795]: I0310 15:22:30.999825 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz" Mar 10 15:22:31 crc kubenswrapper[4795]: I0310 15:22:30.999844 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e88c0f1b5f5a0b09b56778f7c62449ae835e0ad0aa2c4ecbe9a8421db9bf84c" Mar 10 15:22:31 crc kubenswrapper[4795]: I0310 15:22:31.568460 4795 scope.go:117] "RemoveContainer" containerID="26b6215e3d8f46effa1b154c6dc341ea80b349d32bb0d7b5ca65a972d6372eb5" Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.517025 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck"] Mar 10 15:22:37 crc kubenswrapper[4795]: E0310 15:22:37.517607 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ffa720-9d8d-48bb-ba13-87d605145e4e" containerName="util" Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.517620 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ffa720-9d8d-48bb-ba13-87d605145e4e" containerName="util" Mar 10 15:22:37 crc kubenswrapper[4795]: E0310 15:22:37.517635 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ffa720-9d8d-48bb-ba13-87d605145e4e" containerName="extract" Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.517642 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ffa720-9d8d-48bb-ba13-87d605145e4e" containerName="extract" Mar 10 15:22:37 crc kubenswrapper[4795]: E0310 15:22:37.517655 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ffa720-9d8d-48bb-ba13-87d605145e4e" containerName="pull" Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.517661 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ffa720-9d8d-48bb-ba13-87d605145e4e" containerName="pull" Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.517754 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ffa720-9d8d-48bb-ba13-87d605145e4e" containerName="extract" Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.518142 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck" Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.523828 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-l26m2" Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.535847 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck"] Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.636685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk72l\" (UniqueName: \"kubernetes.io/projected/cb672437-b689-4c5a-ba13-934f96350bbf-kube-api-access-tk72l\") pod \"openstack-operator-controller-init-7c7f7d994-987ck\" (UID: \"cb672437-b689-4c5a-ba13-934f96350bbf\") " pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck" Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.737378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk72l\" (UniqueName: \"kubernetes.io/projected/cb672437-b689-4c5a-ba13-934f96350bbf-kube-api-access-tk72l\") pod \"openstack-operator-controller-init-7c7f7d994-987ck\" (UID: \"cb672437-b689-4c5a-ba13-934f96350bbf\") " pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck" Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.776278 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tk72l\" (UniqueName: \"kubernetes.io/projected/cb672437-b689-4c5a-ba13-934f96350bbf-kube-api-access-tk72l\") pod \"openstack-operator-controller-init-7c7f7d994-987ck\" (UID: \"cb672437-b689-4c5a-ba13-934f96350bbf\") " pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck" Mar 10 15:22:37 crc kubenswrapper[4795]: I0310 15:22:37.835729 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck" Mar 10 15:22:38 crc kubenswrapper[4795]: I0310 15:22:38.090384 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck"] Mar 10 15:22:39 crc kubenswrapper[4795]: I0310 15:22:39.076829 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck" event={"ID":"cb672437-b689-4c5a-ba13-934f96350bbf","Type":"ContainerStarted","Data":"27e388dc0313c3be49a1c27605662236fa869f685c8ab351fad9365e7199f754"} Mar 10 15:22:42 crc kubenswrapper[4795]: I0310 15:22:42.095926 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck" event={"ID":"cb672437-b689-4c5a-ba13-934f96350bbf","Type":"ContainerStarted","Data":"1d7715a843f0986c17d11711156927fcf5d2138f9fc4068c6f31f3c5aa089784"} Mar 10 15:22:42 crc kubenswrapper[4795]: I0310 15:22:42.096756 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck" Mar 10 15:22:42 crc kubenswrapper[4795]: I0310 15:22:42.137612 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck" podStartSLOduration=1.621017157 podStartE2EDuration="5.137582583s" podCreationTimestamp="2026-03-10 15:22:37 +0000 UTC" firstStartedPulling="2026-03-10 
15:22:38.113861552 +0000 UTC m=+991.279602450" lastFinishedPulling="2026-03-10 15:22:41.630426968 +0000 UTC m=+994.796167876" observedRunningTime="2026-03-10 15:22:42.128489663 +0000 UTC m=+995.294230611" watchObservedRunningTime="2026-03-10 15:22:42.137582583 +0000 UTC m=+995.303323521" Mar 10 15:22:47 crc kubenswrapper[4795]: I0310 15:22:47.838711 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7c7f7d994-987ck" Mar 10 15:22:48 crc kubenswrapper[4795]: I0310 15:22:48.539904 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:22:48 crc kubenswrapper[4795]: I0310 15:22:48.540473 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.081278 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.082767 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.085883 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-v7np6" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.087221 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.087691 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vvj\" (UniqueName: \"kubernetes.io/projected/c5c14124-2fc6-4052-b12a-81336c47ae33-kube-api-access-c7vvj\") pod \"barbican-operator-controller-manager-677bd678f7-rgwvl\" (UID: \"c5c14124-2fc6-4052-b12a-81336c47ae33\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.088405 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.091517 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-c7fsv" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.098231 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.107602 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.136018 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.137009 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.141532 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.142577 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.144079 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-vbnmg" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.144678 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-vqh79" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.150145 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.150873 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.154986 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-hhzpg" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.168844 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.172548 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.176150 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.189707 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7vvj\" (UniqueName: \"kubernetes.io/projected/c5c14124-2fc6-4052-b12a-81336c47ae33-kube-api-access-c7vvj\") pod 
\"barbican-operator-controller-manager-677bd678f7-rgwvl\" (UID: \"c5c14124-2fc6-4052-b12a-81336c47ae33\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.190889 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.191745 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.198242 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.200728 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-7nk7x" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.202152 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.204486 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.211480 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-mj7q8" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.213011 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.231046 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.232106 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.234460 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-klrk9" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.236720 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vvj\" (UniqueName: \"kubernetes.io/projected/c5c14124-2fc6-4052-b12a-81336c47ae33-kube-api-access-c7vvj\") pod \"barbican-operator-controller-manager-677bd678f7-rgwvl\" (UID: \"c5c14124-2fc6-4052-b12a-81336c47ae33\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.250566 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm"] Mar 10 15:23:11 crc 
kubenswrapper[4795]: I0310 15:23:11.251358 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.256341 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.265987 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-6sxj8" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.270199 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.286281 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.286975 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.314595 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw9lv\" (UniqueName: \"kubernetes.io/projected/552bb17b-df18-40ea-8688-f5f5c16e7c5b-kube-api-access-fw9lv\") pod \"glance-operator-controller-manager-5964f64c48-qhk6r\" (UID: \"552bb17b-df18-40ea-8688-f5f5c16e7c5b\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.314696 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgbzr\" (UniqueName: \"kubernetes.io/projected/69ee85e7-4d8f-493c-8480-6eefec2091ae-kube-api-access-kgbzr\") pod \"cinder-operator-controller-manager-984cd4dcf-8tpdc\" (UID: \"69ee85e7-4d8f-493c-8480-6eefec2091ae\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.314758 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t4md\" (UniqueName: \"kubernetes.io/projected/a8f94485-8cb5-43ab-b30d-2ad0b0a7836a-kube-api-access-7t4md\") pod \"designate-operator-controller-manager-66d56f6ff4-tlmlb\" (UID: \"a8f94485-8cb5-43ab-b30d-2ad0b0a7836a\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.314904 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg9mh\" (UniqueName: \"kubernetes.io/projected/0171e721-a233-4112-ac5b-503a1aef22eb-kube-api-access-gg9mh\") pod \"heat-operator-controller-manager-77b6666d85-vxz75\" (UID: \"0171e721-a233-4112-ac5b-503a1aef22eb\") " 
pod="openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.318497 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-ddxzh" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.339193 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.402395 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.403840 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.407522 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.408718 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.411118 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.411771 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.418806 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgbzr\" (UniqueName: \"kubernetes.io/projected/69ee85e7-4d8f-493c-8480-6eefec2091ae-kube-api-access-kgbzr\") pod \"cinder-operator-controller-manager-984cd4dcf-8tpdc\" (UID: \"69ee85e7-4d8f-493c-8480-6eefec2091ae\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.418845 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s96kr\" (UniqueName: \"kubernetes.io/projected/d6c4189b-47e3-41a5-83b6-2e6673f8d595-kube-api-access-s96kr\") pod \"ironic-operator-controller-manager-6bbb499bbc-h9tr6\" (UID: \"d6c4189b-47e3-41a5-83b6-2e6673f8d595\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.418868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t4md\" (UniqueName: \"kubernetes.io/projected/a8f94485-8cb5-43ab-b30d-2ad0b0a7836a-kube-api-access-7t4md\") pod \"designate-operator-controller-manager-66d56f6ff4-tlmlb\" (UID: \"a8f94485-8cb5-43ab-b30d-2ad0b0a7836a\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.418890 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpqkx\" (UniqueName: \"kubernetes.io/projected/85d0556d-e44b-4a30-a0f9-076e356bceef-kube-api-access-wpqkx\") pod \"manila-operator-controller-manager-68f45f9d9f-8k7b2\" (UID: \"85d0556d-e44b-4a30-a0f9-076e356bceef\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2" Mar 10 15:23:11 
crc kubenswrapper[4795]: I0310 15:23:11.418919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg9mh\" (UniqueName: \"kubernetes.io/projected/0171e721-a233-4112-ac5b-503a1aef22eb-kube-api-access-gg9mh\") pod \"heat-operator-controller-manager-77b6666d85-vxz75\" (UID: \"0171e721-a233-4112-ac5b-503a1aef22eb\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.418974 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert\") pod \"infra-operator-controller-manager-5995f4446f-4vlhk\" (UID: \"ec0bfd76-f8c1-48a9-b35b-6307d31446e6\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.418994 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl8ql\" (UniqueName: \"kubernetes.io/projected/d8e67a8d-0858-4177-b9a0-fa1ba281424c-kube-api-access-pl8ql\") pod \"horizon-operator-controller-manager-6d9d6b584d-kcdqn\" (UID: \"d8e67a8d-0858-4177-b9a0-fa1ba281424c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.419018 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cwc6\" (UniqueName: \"kubernetes.io/projected/9968b76f-ff42-4d0e-9096-a229cf314dcb-kube-api-access-9cwc6\") pod \"keystone-operator-controller-manager-684f77d66d-xhvcm\" (UID: \"9968b76f-ff42-4d0e-9096-a229cf314dcb\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.419039 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-5qhmz\" (UniqueName: \"kubernetes.io/projected/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-kube-api-access-5qhmz\") pod \"infra-operator-controller-manager-5995f4446f-4vlhk\" (UID: \"ec0bfd76-f8c1-48a9-b35b-6307d31446e6\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.419086 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw9lv\" (UniqueName: \"kubernetes.io/projected/552bb17b-df18-40ea-8688-f5f5c16e7c5b-kube-api-access-fw9lv\") pod \"glance-operator-controller-manager-5964f64c48-qhk6r\" (UID: \"552bb17b-df18-40ea-8688-f5f5c16e7c5b\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.420314 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.423943 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-7mg5m" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.424125 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-78f2d" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.430481 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.431296 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.433647 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-lkcg9" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.438739 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.456615 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgbzr\" (UniqueName: \"kubernetes.io/projected/69ee85e7-4d8f-493c-8480-6eefec2091ae-kube-api-access-kgbzr\") pod \"cinder-operator-controller-manager-984cd4dcf-8tpdc\" (UID: \"69ee85e7-4d8f-493c-8480-6eefec2091ae\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.459020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t4md\" (UniqueName: \"kubernetes.io/projected/a8f94485-8cb5-43ab-b30d-2ad0b0a7836a-kube-api-access-7t4md\") pod \"designate-operator-controller-manager-66d56f6ff4-tlmlb\" (UID: \"a8f94485-8cb5-43ab-b30d-2ad0b0a7836a\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.461388 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.466960 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg9mh\" (UniqueName: \"kubernetes.io/projected/0171e721-a233-4112-ac5b-503a1aef22eb-kube-api-access-gg9mh\") pod \"heat-operator-controller-manager-77b6666d85-vxz75\" (UID: \"0171e721-a233-4112-ac5b-503a1aef22eb\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.467445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw9lv\" (UniqueName: \"kubernetes.io/projected/552bb17b-df18-40ea-8688-f5f5c16e7c5b-kube-api-access-fw9lv\") pod \"glance-operator-controller-manager-5964f64c48-qhk6r\" (UID: \"552bb17b-df18-40ea-8688-f5f5c16e7c5b\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.480158 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.483934 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.488409 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.525812 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.526122 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert\") pod \"infra-operator-controller-manager-5995f4446f-4vlhk\" (UID: \"ec0bfd76-f8c1-48a9-b35b-6307d31446e6\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.526229 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl8ql\" (UniqueName: \"kubernetes.io/projected/d8e67a8d-0858-4177-b9a0-fa1ba281424c-kube-api-access-pl8ql\") pod \"horizon-operator-controller-manager-6d9d6b584d-kcdqn\" (UID: \"d8e67a8d-0858-4177-b9a0-fa1ba281424c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.526311 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cwc6\" (UniqueName: \"kubernetes.io/projected/9968b76f-ff42-4d0e-9096-a229cf314dcb-kube-api-access-9cwc6\") pod \"keystone-operator-controller-manager-684f77d66d-xhvcm\" (UID: \"9968b76f-ff42-4d0e-9096-a229cf314dcb\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.526388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qhmz\" (UniqueName: \"kubernetes.io/projected/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-kube-api-access-5qhmz\") pod \"infra-operator-controller-manager-5995f4446f-4vlhk\" (UID: 
\"ec0bfd76-f8c1-48a9-b35b-6307d31446e6\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.526457 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.526459 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzxmd\" (UniqueName: \"kubernetes.io/projected/1d435dd6-f95b-4883-928d-d010f897bb68-kube-api-access-jzxmd\") pod \"mariadb-operator-controller-manager-658d4cdd5-7vht5\" (UID: \"1d435dd6-f95b-4883-928d-d010f897bb68\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.526609 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s96kr\" (UniqueName: \"kubernetes.io/projected/d6c4189b-47e3-41a5-83b6-2e6673f8d595-kube-api-access-s96kr\") pod \"ironic-operator-controller-manager-6bbb499bbc-h9tr6\" (UID: \"d6c4189b-47e3-41a5-83b6-2e6673f8d595\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.526637 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpqkx\" (UniqueName: \"kubernetes.io/projected/85d0556d-e44b-4a30-a0f9-076e356bceef-kube-api-access-wpqkx\") pod \"manila-operator-controller-manager-68f45f9d9f-8k7b2\" (UID: \"85d0556d-e44b-4a30-a0f9-076e356bceef\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2" Mar 10 15:23:11 crc kubenswrapper[4795]: E0310 15:23:11.526733 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.526823 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk" Mar 10 15:23:11 crc kubenswrapper[4795]: E0310 15:23:11.526888 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert podName:ec0bfd76-f8c1-48a9-b35b-6307d31446e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:12.026815172 +0000 UTC m=+1025.192556070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert") pod "infra-operator-controller-manager-5995f4446f-4vlhk" (UID: "ec0bfd76-f8c1-48a9-b35b-6307d31446e6") : secret "infra-operator-webhook-server-cert" not found Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.526755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktgdr\" (UniqueName: \"kubernetes.io/projected/9aefc86f-37f5-4056-9e78-0eb01103e984-kube-api-access-ktgdr\") pod \"neutron-operator-controller-manager-776c5696bf-sz4bh\" (UID: \"9aefc86f-37f5-4056-9e78-0eb01103e984\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.527424 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkwwx\" (UniqueName: \"kubernetes.io/projected/d3b35266-d392-4d69-8ba2-471d69708706-kube-api-access-qkwwx\") pod \"nova-operator-controller-manager-569cc54c5-pfmkh\" (UID: \"d3b35266-d392-4d69-8ba2-471d69708706\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.546137 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb"] Mar 10 15:23:11 crc 
kubenswrapper[4795]: I0310 15:23:11.546949 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.549126 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-klvth" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.551440 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.551599 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-l9jjn" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.564585 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s96kr\" (UniqueName: \"kubernetes.io/projected/d6c4189b-47e3-41a5-83b6-2e6673f8d595-kube-api-access-s96kr\") pod \"ironic-operator-controller-manager-6bbb499bbc-h9tr6\" (UID: \"d6c4189b-47e3-41a5-83b6-2e6673f8d595\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.571675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl8ql\" (UniqueName: \"kubernetes.io/projected/d8e67a8d-0858-4177-b9a0-fa1ba281424c-kube-api-access-pl8ql\") pod \"horizon-operator-controller-manager-6d9d6b584d-kcdqn\" (UID: \"d8e67a8d-0858-4177-b9a0-fa1ba281424c\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.596859 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 
15:23:11.599872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpqkx\" (UniqueName: \"kubernetes.io/projected/85d0556d-e44b-4a30-a0f9-076e356bceef-kube-api-access-wpqkx\") pod \"manila-operator-controller-manager-68f45f9d9f-8k7b2\" (UID: \"85d0556d-e44b-4a30-a0f9-076e356bceef\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.600394 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qhmz\" (UniqueName: \"kubernetes.io/projected/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-kube-api-access-5qhmz\") pod \"infra-operator-controller-manager-5995f4446f-4vlhk\" (UID: \"ec0bfd76-f8c1-48a9-b35b-6307d31446e6\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.602172 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cwc6\" (UniqueName: \"kubernetes.io/projected/9968b76f-ff42-4d0e-9096-a229cf314dcb-kube-api-access-9cwc6\") pod \"keystone-operator-controller-manager-684f77d66d-xhvcm\" (UID: \"9968b76f-ff42-4d0e-9096-a229cf314dcb\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.612180 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.630855 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktgdr\" (UniqueName: \"kubernetes.io/projected/9aefc86f-37f5-4056-9e78-0eb01103e984-kube-api-access-ktgdr\") pod \"neutron-operator-controller-manager-776c5696bf-sz4bh\" (UID: \"9aefc86f-37f5-4056-9e78-0eb01103e984\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.630914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkwwx\" (UniqueName: \"kubernetes.io/projected/d3b35266-d392-4d69-8ba2-471d69708706-kube-api-access-qkwwx\") pod \"nova-operator-controller-manager-569cc54c5-pfmkh\" (UID: \"d3b35266-d392-4d69-8ba2-471d69708706\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.630982 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzxmd\" (UniqueName: \"kubernetes.io/projected/1d435dd6-f95b-4883-928d-d010f897bb68-kube-api-access-jzxmd\") pod \"mariadb-operator-controller-manager-658d4cdd5-7vht5\" (UID: \"1d435dd6-f95b-4883-928d-d010f897bb68\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.631013 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n69qj\" (UniqueName: \"kubernetes.io/projected/d7d04047-d616-4c35-a6f3-7767688d4393-kube-api-access-n69qj\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb\" (UID: \"d7d04047-d616-4c35-a6f3-7767688d4393\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:11 crc 
kubenswrapper[4795]: I0310 15:23:11.631092 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb\" (UID: \"d7d04047-d616-4c35-a6f3-7767688d4393\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.631112 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq7d9\" (UniqueName: \"kubernetes.io/projected/7ffe6f4f-ca5a-4b61-977c-1fcd22035674-kube-api-access-vq7d9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-bjdwk\" (UID: \"7ffe6f4f-ca5a-4b61-977c-1fcd22035674\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.644640 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.650161 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.662175 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.668773 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.669907 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.670361 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktgdr\" (UniqueName: \"kubernetes.io/projected/9aefc86f-37f5-4056-9e78-0eb01103e984-kube-api-access-ktgdr\") pod \"neutron-operator-controller-manager-776c5696bf-sz4bh\" (UID: \"9aefc86f-37f5-4056-9e78-0eb01103e984\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.677150 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-86lrv" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.677280 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.677444 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-49fwq" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.678043 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.679014 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzxmd\" (UniqueName: \"kubernetes.io/projected/1d435dd6-f95b-4883-928d-d010f897bb68-kube-api-access-jzxmd\") pod \"mariadb-operator-controller-manager-658d4cdd5-7vht5\" (UID: \"1d435dd6-f95b-4883-928d-d010f897bb68\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.680255 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-72gh5" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.680393 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.683728 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.683914 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkwwx\" (UniqueName: \"kubernetes.io/projected/d3b35266-d392-4d69-8ba2-471d69708706-kube-api-access-qkwwx\") pod \"nova-operator-controller-manager-569cc54c5-pfmkh\" (UID: \"d3b35266-d392-4d69-8ba2-471d69708706\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.692289 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.692573 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.698574 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.700173 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vvrfw" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.704981 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.711789 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.727081 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.728177 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.745522 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-qhtfr" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.745944 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.755701 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n69qj\" (UniqueName: \"kubernetes.io/projected/d7d04047-d616-4c35-a6f3-7767688d4393-kube-api-access-n69qj\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb\" (UID: \"d7d04047-d616-4c35-a6f3-7767688d4393\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.755780 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc5wr\" (UniqueName: \"kubernetes.io/projected/b01360ed-92af-4626-a226-7cf86bdd51e1-kube-api-access-cc5wr\") pod \"ovn-operator-controller-manager-bbc5b68f9-tglhw\" (UID: \"b01360ed-92af-4626-a226-7cf86bdd51e1\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.755809 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb\" (UID: \"d7d04047-d616-4c35-a6f3-7767688d4393\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.755829 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq7d9\" (UniqueName: \"kubernetes.io/projected/7ffe6f4f-ca5a-4b61-977c-1fcd22035674-kube-api-access-vq7d9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-bjdwk\" (UID: \"7ffe6f4f-ca5a-4b61-977c-1fcd22035674\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk" Mar 10 15:23:11 crc 
kubenswrapper[4795]: I0310 15:23:11.820826 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m"] Mar 10 15:23:11 crc kubenswrapper[4795]: E0310 15:23:11.829493 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:23:11 crc kubenswrapper[4795]: E0310 15:23:11.829561 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert podName:d7d04047-d616-4c35-a6f3-7767688d4393 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:12.329538656 +0000 UTC m=+1025.495279554 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" (UID: "d7d04047-d616-4c35-a6f3-7767688d4393") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.848951 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.866771 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n69qj\" (UniqueName: \"kubernetes.io/projected/d7d04047-d616-4c35-a6f3-7767688d4393-kube-api-access-n69qj\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb\" (UID: \"d7d04047-d616-4c35-a6f3-7767688d4393\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.869578 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.870250 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.871027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq7d9\" (UniqueName: \"kubernetes.io/projected/7ffe6f4f-ca5a-4b61-977c-1fcd22035674-kube-api-access-vq7d9\") pod \"octavia-operator-controller-manager-5f4f55cb5c-bjdwk\" (UID: \"7ffe6f4f-ca5a-4b61-977c-1fcd22035674\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.876393 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.879151 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.883167 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-xdhfm" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.925946 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncmvf\" (UniqueName: \"kubernetes.io/projected/7a87e459-bb41-405f-8fea-040c8a223373-kube-api-access-ncmvf\") pod \"swift-operator-controller-manager-677c674df7-qz5k7\" (UID: \"7a87e459-bb41-405f-8fea-040c8a223373\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.926002 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8s7w\" (UniqueName: \"kubernetes.io/projected/a4e07d9c-566b-4d59-869a-2d3720455624-kube-api-access-t8s7w\") pod \"test-operator-controller-manager-5c5cb9c4d7-zsx7m\" (UID: \"a4e07d9c-566b-4d59-869a-2d3720455624\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.926054 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r8vw\" (UniqueName: \"kubernetes.io/projected/23620549-aa69-4e1b-bfb4-e335532a318c-kube-api-access-7r8vw\") pod \"placement-operator-controller-manager-574d45c66c-kd79t\" (UID: \"23620549-aa69-4e1b-bfb4-e335532a318c\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 
15:23:11.926126 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf7z4\" (UniqueName: \"kubernetes.io/projected/5242d539-a4e7-4a4c-b485-e8c43ce52546-kube-api-access-xf7z4\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-5xwgv\" (UID: \"5242d539-a4e7-4a4c-b485-e8c43ce52546\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.926153 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc5wr\" (UniqueName: \"kubernetes.io/projected/b01360ed-92af-4626-a226-7cf86bdd51e1-kube-api-access-cc5wr\") pod \"ovn-operator-controller-manager-bbc5b68f9-tglhw\" (UID: \"b01360ed-92af-4626-a226-7cf86bdd51e1\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.929342 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.936117 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.939852 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.940179 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.940388 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-4rchn" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.940980 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.953972 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.961319 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.962286 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.965392 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc5wr\" (UniqueName: \"kubernetes.io/projected/b01360ed-92af-4626-a226-7cf86bdd51e1-kube-api-access-cc5wr\") pod \"ovn-operator-controller-manager-bbc5b68f9-tglhw\" (UID: \"b01360ed-92af-4626-a226-7cf86bdd51e1\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.967006 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-k6bdg" Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.970215 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm"] Mar 10 15:23:11 crc kubenswrapper[4795]: I0310 15:23:11.980839 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.002387 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.027618 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.027669 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6fwk\" (UniqueName: \"kubernetes.io/projected/44fbebe4-6a17-4378-9f39-bda40adb7e02-kube-api-access-m6fwk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hkmpm\" (UID: \"44fbebe4-6a17-4378-9f39-bda40adb7e02\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.027699 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf7z4\" (UniqueName: \"kubernetes.io/projected/5242d539-a4e7-4a4c-b485-e8c43ce52546-kube-api-access-xf7z4\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-5xwgv\" (UID: \"5242d539-a4e7-4a4c-b485-e8c43ce52546\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.027771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:12 crc 
kubenswrapper[4795]: I0310 15:23:12.027793 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r6n4\" (UniqueName: \"kubernetes.io/projected/69a31d53-90dd-46ca-a5ee-8841b89445e6-kube-api-access-7r6n4\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.027810 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncmvf\" (UniqueName: \"kubernetes.io/projected/7a87e459-bb41-405f-8fea-040c8a223373-kube-api-access-ncmvf\") pod \"swift-operator-controller-manager-677c674df7-qz5k7\" (UID: \"7a87e459-bb41-405f-8fea-040c8a223373\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.027832 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8s7w\" (UniqueName: \"kubernetes.io/projected/a4e07d9c-566b-4d59-869a-2d3720455624-kube-api-access-t8s7w\") pod \"test-operator-controller-manager-5c5cb9c4d7-zsx7m\" (UID: \"a4e07d9c-566b-4d59-869a-2d3720455624\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.027860 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert\") pod \"infra-operator-controller-manager-5995f4446f-4vlhk\" (UID: \"ec0bfd76-f8c1-48a9-b35b-6307d31446e6\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.027887 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz7qc\" (UniqueName: 
\"kubernetes.io/projected/f23149ba-6bcc-49ac-93fc-60092174c5a8-kube-api-access-hz7qc\") pod \"watcher-operator-controller-manager-6dd88c6f67-fl2gq\" (UID: \"f23149ba-6bcc-49ac-93fc-60092174c5a8\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.027910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r8vw\" (UniqueName: \"kubernetes.io/projected/23620549-aa69-4e1b-bfb4-e335532a318c-kube-api-access-7r8vw\") pod \"placement-operator-controller-manager-574d45c66c-kd79t\" (UID: \"23620549-aa69-4e1b-bfb4-e335532a318c\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t" Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.028259 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.028399 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert podName:ec0bfd76-f8c1-48a9-b35b-6307d31446e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:13.028374067 +0000 UTC m=+1026.194114975 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert") pod "infra-operator-controller-manager-5995f4446f-4vlhk" (UID: "ec0bfd76-f8c1-48a9-b35b-6307d31446e6") : secret "infra-operator-webhook-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.052953 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8s7w\" (UniqueName: \"kubernetes.io/projected/a4e07d9c-566b-4d59-869a-2d3720455624-kube-api-access-t8s7w\") pod \"test-operator-controller-manager-5c5cb9c4d7-zsx7m\" (UID: \"a4e07d9c-566b-4d59-869a-2d3720455624\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.058276 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r8vw\" (UniqueName: \"kubernetes.io/projected/23620549-aa69-4e1b-bfb4-e335532a318c-kube-api-access-7r8vw\") pod \"placement-operator-controller-manager-574d45c66c-kd79t\" (UID: \"23620549-aa69-4e1b-bfb4-e335532a318c\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.058374 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncmvf\" (UniqueName: \"kubernetes.io/projected/7a87e459-bb41-405f-8fea-040c8a223373-kube-api-access-ncmvf\") pod \"swift-operator-controller-manager-677c674df7-qz5k7\" (UID: \"7a87e459-bb41-405f-8fea-040c8a223373\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.060509 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf7z4\" (UniqueName: \"kubernetes.io/projected/5242d539-a4e7-4a4c-b485-e8c43ce52546-kube-api-access-xf7z4\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-5xwgv\" (UID: 
\"5242d539-a4e7-4a4c-b485-e8c43ce52546\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.061974 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.125652 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.129046 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6fwk\" (UniqueName: \"kubernetes.io/projected/44fbebe4-6a17-4378-9f39-bda40adb7e02-kube-api-access-m6fwk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hkmpm\" (UID: \"44fbebe4-6a17-4378-9f39-bda40adb7e02\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.129137 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.129167 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r6n4\" (UniqueName: \"kubernetes.io/projected/69a31d53-90dd-46ca-a5ee-8841b89445e6-kube-api-access-7r6n4\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.129522 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hz7qc\" (UniqueName: \"kubernetes.io/projected/f23149ba-6bcc-49ac-93fc-60092174c5a8-kube-api-access-hz7qc\") pod \"watcher-operator-controller-manager-6dd88c6f67-fl2gq\" (UID: \"f23149ba-6bcc-49ac-93fc-60092174c5a8\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.129684 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.129877 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.129931 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:12.629911682 +0000 UTC m=+1025.795652580 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "webhook-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.129934 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.129980 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:12.629966503 +0000 UTC m=+1025.795707401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "metrics-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.130247 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.147125 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz7qc\" (UniqueName: \"kubernetes.io/projected/f23149ba-6bcc-49ac-93fc-60092174c5a8-kube-api-access-hz7qc\") pod \"watcher-operator-controller-manager-6dd88c6f67-fl2gq\" (UID: \"f23149ba-6bcc-49ac-93fc-60092174c5a8\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.150533 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r6n4\" (UniqueName: \"kubernetes.io/projected/69a31d53-90dd-46ca-a5ee-8841b89445e6-kube-api-access-7r6n4\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.151140 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6fwk\" (UniqueName: \"kubernetes.io/projected/44fbebe4-6a17-4378-9f39-bda40adb7e02-kube-api-access-m6fwk\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hkmpm\" (UID: \"44fbebe4-6a17-4378-9f39-bda40adb7e02\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.156733 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.198701 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.226497 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.247558 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.289718 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.294087 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.308811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl" event={"ID":"c5c14124-2fc6-4052-b12a-81336c47ae33","Type":"ContainerStarted","Data":"c2b6f796797ab9e8b137530f64bfa977258cc5072745089ace5161d08f641642"} Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.321014 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.332100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb\" (UID: \"d7d04047-d616-4c35-a6f3-7767688d4393\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.332273 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.332361 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert podName:d7d04047-d616-4c35-a6f3-7767688d4393 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:13.332336955 +0000 UTC m=+1026.498077863 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" (UID: "d7d04047-d616-4c35-a6f3-7767688d4393") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: W0310 15:23:12.345238 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod552bb17b_df18_40ea_8688_f5f5c16e7c5b.slice/crio-27699b9deea9605a5fcb5e191839058bd5fff8ced603326a9cb5783ed1812246 WatchSource:0}: Error finding container 27699b9deea9605a5fcb5e191839058bd5fff8ced603326a9cb5783ed1812246: Status 404 returned error can't find the container with id 27699b9deea9605a5fcb5e191839058bd5fff8ced603326a9cb5783ed1812246 Mar 10 15:23:12 crc kubenswrapper[4795]: W0310 15:23:12.349926 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8f94485_8cb5_43ab_b30d_2ad0b0a7836a.slice/crio-b71cfdc034d1dc3b40d6e2334e8f953b4c575b78c20b188b5cc372a946bf5678 WatchSource:0}: Error finding container b71cfdc034d1dc3b40d6e2334e8f953b4c575b78c20b188b5cc372a946bf5678: Status 404 returned error can't find the container with id b71cfdc034d1dc3b40d6e2334e8f953b4c575b78c20b188b5cc372a946bf5678 Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.532671 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.543875 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6"] Mar 10 15:23:12 crc kubenswrapper[4795]: W0310 15:23:12.567755 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6c4189b_47e3_41a5_83b6_2e6673f8d595.slice/crio-f8bc9c01946104f518cafb54ed4ee1120686365194a0e72ce478e64e26ef58b3 WatchSource:0}: Error finding container f8bc9c01946104f518cafb54ed4ee1120686365194a0e72ce478e64e26ef58b3: Status 404 returned error can't find the container with id f8bc9c01946104f518cafb54ed4ee1120686365194a0e72ce478e64e26ef58b3 Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.636692 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.636777 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.636923 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.636987 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:13.636969343 +0000 UTC m=+1026.802710241 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "metrics-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.637050 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.637138 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:13.637115008 +0000 UTC m=+1026.802855906 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "webhook-server-cert" not found Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.697700 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.701825 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.711084 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh"] Mar 10 15:23:12 crc kubenswrapper[4795]: W0310 15:23:12.713022 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69ee85e7_4d8f_493c_8480_6eefec2091ae.slice/crio-3d3f3197dd697e8e2b81ef0c9b6e1e3ace6596336881bc698f93efd87ed90f69 WatchSource:0}: Error finding container 3d3f3197dd697e8e2b81ef0c9b6e1e3ace6596336881bc698f93efd87ed90f69: Status 404 returned error can't find the container with id 3d3f3197dd697e8e2b81ef0c9b6e1e3ace6596336881bc698f93efd87ed90f69 Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.718975 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.729842 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.737332 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.880178 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.886097 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.904154 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw"] Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.914765 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh"] Mar 10 15:23:12 crc kubenswrapper[4795]: W0310 15:23:12.922974 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3b35266_d392_4d69_8ba2_471d69708706.slice/crio-5c87c36258d7cf1dea41e6bb93a2f401095d2e19aabb002500860e3c68699772 WatchSource:0}: Error finding container 5c87c36258d7cf1dea41e6bb93a2f401095d2e19aabb002500860e3c68699772: Status 404 returned error can't find the container with id 5c87c36258d7cf1dea41e6bb93a2f401095d2e19aabb002500860e3c68699772 Mar 10 15:23:12 crc kubenswrapper[4795]: W0310 15:23:12.924586 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb01360ed_92af_4626_a226_7cf86bdd51e1.slice/crio-8b16cec390c8bd666b803cf21eae23473387857567c9ca1aef590c506736eb34 WatchSource:0}: Error finding container 8b16cec390c8bd666b803cf21eae23473387857567c9ca1aef590c506736eb34: Status 404 returned error can't find the container with id 8b16cec390c8bd666b803cf21eae23473387857567c9ca1aef590c506736eb34 Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.925499 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qkwwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-569cc54c5-pfmkh_openstack-operators(d3b35266-d392-4d69-8ba2-471d69708706): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.926663 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" podUID="d3b35266-d392-4d69-8ba2-471d69708706" Mar 10 15:23:12 crc 
kubenswrapper[4795]: W0310 15:23:12.926722 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a87e459_bb41_405f_8fea_040c8a223373.slice/crio-70dc72748620e0ce13f109eef4c0033964b6b2087bedeccf0111311df29e5a24 WatchSource:0}: Error finding container 70dc72748620e0ce13f109eef4c0033964b6b2087bedeccf0111311df29e5a24: Status 404 returned error can't find the container with id 70dc72748620e0ce13f109eef4c0033964b6b2087bedeccf0111311df29e5a24 Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.928139 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cc5wr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-tglhw_openstack-operators(b01360ed-92af-4626-a226-7cf86bdd51e1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.929384 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" podUID="b01360ed-92af-4626-a226-7cf86bdd51e1" Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.929498 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ncmvf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-677c674df7-qz5k7_openstack-operators(7a87e459-bb41-405f-8fea-040c8a223373): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 15:23:12 crc kubenswrapper[4795]: I0310 15:23:12.930280 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7"] Mar 10 15:23:12 crc kubenswrapper[4795]: E0310 15:23:12.930994 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" podUID="7a87e459-bb41-405f-8fea-040c8a223373" Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.041900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert\") pod \"infra-operator-controller-manager-5995f4446f-4vlhk\" (UID: \"ec0bfd76-f8c1-48a9-b35b-6307d31446e6\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.042056 4795 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.042148 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert podName:ec0bfd76-f8c1-48a9-b35b-6307d31446e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:15.042125369 +0000 UTC m=+1028.207866347 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert") pod "infra-operator-controller-manager-5995f4446f-4vlhk" (UID: "ec0bfd76-f8c1-48a9-b35b-6307d31446e6") : secret "infra-operator-webhook-server-cert" not found Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.103051 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm"] Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.115516 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xf7z4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6cd66dbd4b-5xwgv_openstack-operators(5242d539-a4e7-4a4c-b485-e8c43ce52546): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.116886 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" podUID="5242d539-a4e7-4a4c-b485-e8c43ce52546" Mar 10 15:23:13 crc 
kubenswrapper[4795]: I0310 15:23:13.121902 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv"] Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.130439 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m"] Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.130422 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-t8s7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-zsx7m_openstack-operators(a4e07d9c-566b-4d59-869a-2d3720455624): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.133153 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" podUID="a4e07d9c-566b-4d59-869a-2d3720455624" Mar 10 15:23:13 crc kubenswrapper[4795]: W0310 15:23:13.133905 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf23149ba_6bcc_49ac_93fc_60092174c5a8.slice/crio-cb58a0c26daaf44c4ece427e61193f6d3f93ff28c8a686a72149c79c61c9fa71 WatchSource:0}: Error finding container cb58a0c26daaf44c4ece427e61193f6d3f93ff28c8a686a72149c79c61c9fa71: Status 404 returned error can't find the container with id cb58a0c26daaf44c4ece427e61193f6d3f93ff28c8a686a72149c79c61c9fa71 Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.135632 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq"] Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.142087 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hz7qc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6dd88c6f67-fl2gq_openstack-operators(f23149ba-6bcc-49ac-93fc-60092174c5a8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.143281 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" podUID="f23149ba-6bcc-49ac-93fc-60092174c5a8" Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.317085 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc" event={"ID":"69ee85e7-4d8f-493c-8480-6eefec2091ae","Type":"ContainerStarted","Data":"3d3f3197dd697e8e2b81ef0c9b6e1e3ace6596336881bc698f93efd87ed90f69"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.318869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" 
event={"ID":"d3b35266-d392-4d69-8ba2-471d69708706","Type":"ContainerStarted","Data":"5c87c36258d7cf1dea41e6bb93a2f401095d2e19aabb002500860e3c68699772"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.320032 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" event={"ID":"5242d539-a4e7-4a4c-b485-e8c43ce52546","Type":"ContainerStarted","Data":"833527c070eadd8c8c9f47adb3809fc5edc86c8effecfb60f967b8e6f96359d2"} Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.320417 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\"\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" podUID="d3b35266-d392-4d69-8ba2-471d69708706" Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.321240 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" podUID="5242d539-a4e7-4a4c-b485-e8c43ce52546" Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.321438 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh" event={"ID":"9aefc86f-37f5-4056-9e78-0eb01103e984","Type":"ContainerStarted","Data":"4e5464ff416332338b6c25c69c55e3fedba25e52530612a4c1218b6a18298c7b"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.324362 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5" 
event={"ID":"1d435dd6-f95b-4883-928d-d010f897bb68","Type":"ContainerStarted","Data":"2aefd9715ca4266061e152201541cd1ba5436ae3b9e84d17859121801c297fc8"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.325515 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm" event={"ID":"44fbebe4-6a17-4378-9f39-bda40adb7e02","Type":"ContainerStarted","Data":"399a243937bebbc22a384a9cc69c440c928265981615c3b4e1933eb93ab68804"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.326551 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn" event={"ID":"d8e67a8d-0858-4177-b9a0-fa1ba281424c","Type":"ContainerStarted","Data":"4674fc7df4b9aa344bd8756fffb978f5daca9fdbf3b748898cddf150049fa7ec"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.327779 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" event={"ID":"a4e07d9c-566b-4d59-869a-2d3720455624","Type":"ContainerStarted","Data":"96d89b8501da3991d3834f50ea530354e616e33d899ff38adfa370fd35765b10"} Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.329934 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" podUID="a4e07d9c-566b-4d59-869a-2d3720455624" Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.330691 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" 
event={"ID":"552bb17b-df18-40ea-8688-f5f5c16e7c5b","Type":"ContainerStarted","Data":"27699b9deea9605a5fcb5e191839058bd5fff8ced603326a9cb5783ed1812246"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.331937 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2" event={"ID":"85d0556d-e44b-4a30-a0f9-076e356bceef","Type":"ContainerStarted","Data":"dd53e8c801f6e2d3a1dbbd497aefeae5b655d5a41c88555f0c010d250fdd4dc9"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.334595 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" event={"ID":"b01360ed-92af-4626-a226-7cf86bdd51e1","Type":"ContainerStarted","Data":"8b16cec390c8bd666b803cf21eae23473387857567c9ca1aef590c506736eb34"} Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.336824 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" podUID="b01360ed-92af-4626-a226-7cf86bdd51e1" Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.339489 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk" event={"ID":"7ffe6f4f-ca5a-4b61-977c-1fcd22035674","Type":"ContainerStarted","Data":"843a581e2fd6d477274bca575ab639d33a2ff103f8bd57d17cb806cce2245627"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.342473 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t" 
event={"ID":"23620549-aa69-4e1b-bfb4-e335532a318c","Type":"ContainerStarted","Data":"7f4424006c496a8c8818b40d66e8ab0c30a6d031337f7ddbe0624d7c925ea988"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.344491 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" event={"ID":"f23149ba-6bcc-49ac-93fc-60092174c5a8","Type":"ContainerStarted","Data":"cb58a0c26daaf44c4ece427e61193f6d3f93ff28c8a686a72149c79c61c9fa71"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.346180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb\" (UID: \"d7d04047-d616-4c35-a6f3-7767688d4393\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.347287 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" podUID="f23149ba-6bcc-49ac-93fc-60092174c5a8" Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.353386 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.353510 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert podName:d7d04047-d616-4c35-a6f3-7767688d4393 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:23:15.353474179 +0000 UTC m=+1028.519215117 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" (UID: "d7d04047-d616-4c35-a6f3-7767688d4393") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.354844 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" event={"ID":"7a87e459-bb41-405f-8fea-040c8a223373","Type":"ContainerStarted","Data":"70dc72748620e0ce13f109eef4c0033964b6b2087bedeccf0111311df29e5a24"} Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.357351 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" podUID="7a87e459-bb41-405f-8fea-040c8a223373" Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.359115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6" event={"ID":"d6c4189b-47e3-41a5-83b6-2e6673f8d595","Type":"ContainerStarted","Data":"f8bc9c01946104f518cafb54ed4ee1120686365194a0e72ce478e64e26ef58b3"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.361942 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb" event={"ID":"a8f94485-8cb5-43ab-b30d-2ad0b0a7836a","Type":"ContainerStarted","Data":"b71cfdc034d1dc3b40d6e2334e8f953b4c575b78c20b188b5cc372a946bf5678"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 
15:23:13.363175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" event={"ID":"9968b76f-ff42-4d0e-9096-a229cf314dcb","Type":"ContainerStarted","Data":"55274c3145c0822058325b59536b6879568918e311519e6a8fe6f72002612944"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.363992 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75" event={"ID":"0171e721-a233-4112-ac5b-503a1aef22eb","Type":"ContainerStarted","Data":"782acd942b9ff0c367c847659b62e3d13fbed15435ea35de06dd2c7a206133f1"} Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.650228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:13 crc kubenswrapper[4795]: I0310 15:23:13.650383 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.650454 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.650501 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.650518 4795 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:15.6504993 +0000 UTC m=+1028.816240198 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "metrics-server-cert" not found Mar 10 15:23:13 crc kubenswrapper[4795]: E0310 15:23:13.650543 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:15.650533461 +0000 UTC m=+1028.816274359 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "webhook-server-cert" not found Mar 10 15:23:14 crc kubenswrapper[4795]: E0310 15:23:14.381976 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:27c84b712abc2df6108e22636075eec25fea0229800f38594a492fd41b02c49d\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" podUID="5242d539-a4e7-4a4c-b485-e8c43ce52546" Mar 10 15:23:14 crc kubenswrapper[4795]: E0310 15:23:14.382324 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:2bd37bdd917e3abe72613a734ce5021330242ec8cae9b8da76c57a0765152922\\\"\"" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" podUID="d3b35266-d392-4d69-8ba2-471d69708706" Mar 10 15:23:14 crc kubenswrapper[4795]: E0310 15:23:14.382362 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" podUID="a4e07d9c-566b-4d59-869a-2d3720455624" Mar 10 15:23:14 crc kubenswrapper[4795]: E0310 15:23:14.382393 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" podUID="b01360ed-92af-4626-a226-7cf86bdd51e1" Mar 10 15:23:14 crc kubenswrapper[4795]: E0310 15:23:14.382431 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:4af709a2a6a1a1abb9659dbdd6fb3818122bdec7e66009fcced0bf0949f91554\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" podUID="f23149ba-6bcc-49ac-93fc-60092174c5a8" Mar 10 15:23:14 crc kubenswrapper[4795]: E0310 15:23:14.382462 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/swift-operator@sha256:c223309f51714785bd878ad04080f7428567edad793be4f992d492abd77af44c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" podUID="7a87e459-bb41-405f-8fea-040c8a223373" Mar 10 15:23:15 crc kubenswrapper[4795]: I0310 15:23:15.069937 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert\") pod \"infra-operator-controller-manager-5995f4446f-4vlhk\" (UID: \"ec0bfd76-f8c1-48a9-b35b-6307d31446e6\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:15 crc kubenswrapper[4795]: E0310 15:23:15.070132 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 15:23:15 crc kubenswrapper[4795]: E0310 15:23:15.070323 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert podName:ec0bfd76-f8c1-48a9-b35b-6307d31446e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:19.070306766 +0000 UTC m=+1032.236047654 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert") pod "infra-operator-controller-manager-5995f4446f-4vlhk" (UID: "ec0bfd76-f8c1-48a9-b35b-6307d31446e6") : secret "infra-operator-webhook-server-cert" not found Mar 10 15:23:15 crc kubenswrapper[4795]: E0310 15:23:15.376335 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:23:15 crc kubenswrapper[4795]: E0310 15:23:15.376495 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert podName:d7d04047-d616-4c35-a6f3-7767688d4393 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:19.376456347 +0000 UTC m=+1032.542197235 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" (UID: "d7d04047-d616-4c35-a6f3-7767688d4393") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:23:15 crc kubenswrapper[4795]: I0310 15:23:15.378469 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb\" (UID: \"d7d04047-d616-4c35-a6f3-7767688d4393\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:15 crc kubenswrapper[4795]: I0310 15:23:15.713125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: 
\"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:15 crc kubenswrapper[4795]: I0310 15:23:15.713369 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:15 crc kubenswrapper[4795]: E0310 15:23:15.713597 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:23:15 crc kubenswrapper[4795]: E0310 15:23:15.713663 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:19.713642493 +0000 UTC m=+1032.879383411 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "webhook-server-cert" not found Mar 10 15:23:15 crc kubenswrapper[4795]: E0310 15:23:15.714103 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 15:23:15 crc kubenswrapper[4795]: E0310 15:23:15.714140 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:19.714131107 +0000 UTC m=+1032.879872005 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "metrics-server-cert" not found Mar 10 15:23:18 crc kubenswrapper[4795]: I0310 15:23:18.539519 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:23:18 crc kubenswrapper[4795]: I0310 15:23:18.539858 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:23:18 crc kubenswrapper[4795]: I0310 15:23:18.539909 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:23:18 crc kubenswrapper[4795]: I0310 15:23:18.540625 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"552e3b68186e04166961655f3dfc811f8b5622423236f01ad885373bab0e984d"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:23:18 crc kubenswrapper[4795]: I0310 15:23:18.540720 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" 
containerID="cri-o://552e3b68186e04166961655f3dfc811f8b5622423236f01ad885373bab0e984d" gracePeriod=600 Mar 10 15:23:19 crc kubenswrapper[4795]: I0310 15:23:19.159678 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert\") pod \"infra-operator-controller-manager-5995f4446f-4vlhk\" (UID: \"ec0bfd76-f8c1-48a9-b35b-6307d31446e6\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:19 crc kubenswrapper[4795]: E0310 15:23:19.159839 4795 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 10 15:23:19 crc kubenswrapper[4795]: E0310 15:23:19.159892 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert podName:ec0bfd76-f8c1-48a9-b35b-6307d31446e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:27.159875893 +0000 UTC m=+1040.325616791 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert") pod "infra-operator-controller-manager-5995f4446f-4vlhk" (UID: "ec0bfd76-f8c1-48a9-b35b-6307d31446e6") : secret "infra-operator-webhook-server-cert" not found Mar 10 15:23:19 crc kubenswrapper[4795]: I0310 15:23:19.436295 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="552e3b68186e04166961655f3dfc811f8b5622423236f01ad885373bab0e984d" exitCode=0 Mar 10 15:23:19 crc kubenswrapper[4795]: I0310 15:23:19.436340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"552e3b68186e04166961655f3dfc811f8b5622423236f01ad885373bab0e984d"} Mar 10 15:23:19 crc kubenswrapper[4795]: I0310 15:23:19.436376 4795 scope.go:117] "RemoveContainer" containerID="763f5a84ea79ca41e852b7c467dce5e81209d9a31a115f8701cd8dd045d46d0f" Mar 10 15:23:19 crc kubenswrapper[4795]: I0310 15:23:19.464333 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb\" (UID: \"d7d04047-d616-4c35-a6f3-7767688d4393\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:19 crc kubenswrapper[4795]: E0310 15:23:19.464518 4795 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:23:19 crc kubenswrapper[4795]: E0310 15:23:19.464592 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert podName:d7d04047-d616-4c35-a6f3-7767688d4393 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:23:27.464572342 +0000 UTC m=+1040.630313240 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" (UID: "d7d04047-d616-4c35-a6f3-7767688d4393") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 10 15:23:19 crc kubenswrapper[4795]: I0310 15:23:19.768265 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:19 crc kubenswrapper[4795]: I0310 15:23:19.768345 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:19 crc kubenswrapper[4795]: E0310 15:23:19.768465 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:23:19 crc kubenswrapper[4795]: E0310 15:23:19.768489 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 15:23:19 crc kubenswrapper[4795]: E0310 15:23:19.768542 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. 
No retries permitted until 2026-03-10 15:23:27.768525871 +0000 UTC m=+1040.934266759 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "metrics-server-cert" not found Mar 10 15:23:19 crc kubenswrapper[4795]: E0310 15:23:19.768572 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:27.768551062 +0000 UTC m=+1040.934291980 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "webhook-server-cert" not found Mar 10 15:23:27 crc kubenswrapper[4795]: I0310 15:23:27.177085 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert\") pod \"infra-operator-controller-manager-5995f4446f-4vlhk\" (UID: \"ec0bfd76-f8c1-48a9-b35b-6307d31446e6\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:27 crc kubenswrapper[4795]: I0310 15:23:27.195286 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ec0bfd76-f8c1-48a9-b35b-6307d31446e6-cert\") pod \"infra-operator-controller-manager-5995f4446f-4vlhk\" (UID: \"ec0bfd76-f8c1-48a9-b35b-6307d31446e6\") " pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:27 crc kubenswrapper[4795]: I0310 15:23:27.466364 4795 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:27 crc kubenswrapper[4795]: I0310 15:23:27.481274 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb\" (UID: \"d7d04047-d616-4c35-a6f3-7767688d4393\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:27 crc kubenswrapper[4795]: I0310 15:23:27.484457 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d7d04047-d616-4c35-a6f3-7767688d4393-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb\" (UID: \"d7d04047-d616-4c35-a6f3-7767688d4393\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:27 crc kubenswrapper[4795]: I0310 15:23:27.621524 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:27 crc kubenswrapper[4795]: I0310 15:23:27.789183 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:27 crc kubenswrapper[4795]: I0310 15:23:27.789273 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:27 crc kubenswrapper[4795]: E0310 15:23:27.789394 4795 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 10 15:23:27 crc kubenswrapper[4795]: E0310 15:23:27.789493 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:43.789463875 +0000 UTC m=+1056.955204783 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "metrics-server-cert" not found Mar 10 15:23:27 crc kubenswrapper[4795]: E0310 15:23:27.789402 4795 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 10 15:23:27 crc kubenswrapper[4795]: E0310 15:23:27.789602 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs podName:69a31d53-90dd-46ca-a5ee-8841b89445e6 nodeName:}" failed. No retries permitted until 2026-03-10 15:23:43.789575498 +0000 UTC m=+1056.955316396 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs") pod "openstack-operator-controller-manager-76d6f6bb5f-26dcx" (UID: "69a31d53-90dd-46ca-a5ee-8841b89445e6") : secret "webhook-server-cert" not found Mar 10 15:23:27 crc kubenswrapper[4795]: E0310 15:23:27.851155 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60" Mar 10 15:23:27 crc kubenswrapper[4795]: E0310 15:23:27.851325 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fw9lv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5964f64c48-qhk6r_openstack-operators(552bb17b-df18-40ea-8688-f5f5c16e7c5b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:23:27 crc kubenswrapper[4795]: E0310 15:23:27.852473 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" podUID="552bb17b-df18-40ea-8688-f5f5c16e7c5b" Mar 10 15:23:28 crc kubenswrapper[4795]: E0310 15:23:28.517832 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60\\\"\"" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" podUID="552bb17b-df18-40ea-8688-f5f5c16e7c5b" Mar 10 15:23:28 crc kubenswrapper[4795]: E0310 15:23:28.575135 4795 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 10 15:23:28 crc kubenswrapper[4795]: E0310 15:23:28.575353 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9cwc6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-xhvcm_openstack-operators(9968b76f-ff42-4d0e-9096-a229cf314dcb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:23:28 crc kubenswrapper[4795]: E0310 15:23:28.576531 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" podUID="9968b76f-ff42-4d0e-9096-a229cf314dcb" Mar 10 15:23:29 crc kubenswrapper[4795]: E0310 15:23:29.248298 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 10 15:23:29 crc kubenswrapper[4795]: E0310 15:23:29.248528 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-m6fwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-hkmpm_openstack-operators(44fbebe4-6a17-4378-9f39-bda40adb7e02): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:23:29 crc kubenswrapper[4795]: E0310 15:23:29.249854 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm" podUID="44fbebe4-6a17-4378-9f39-bda40adb7e02" Mar 10 15:23:29 crc kubenswrapper[4795]: E0310 15:23:29.528294 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" podUID="9968b76f-ff42-4d0e-9096-a229cf314dcb" Mar 10 15:23:29 crc kubenswrapper[4795]: E0310 15:23:29.528309 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm" podUID="44fbebe4-6a17-4378-9f39-bda40adb7e02" Mar 10 15:23:31 crc kubenswrapper[4795]: I0310 15:23:31.408539 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb"] Mar 10 15:23:32 crc kubenswrapper[4795]: W0310 15:23:32.292155 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7d04047_d616_4c35_a6f3_7767688d4393.slice/crio-e06b977c95b7ab81b442c210b7fde34c7372a2af6dbe155d681137f4d51d3ccd WatchSource:0}: Error finding 
container e06b977c95b7ab81b442c210b7fde34c7372a2af6dbe155d681137f4d51d3ccd: Status 404 returned error can't find the container with id e06b977c95b7ab81b442c210b7fde34c7372a2af6dbe155d681137f4d51d3ccd Mar 10 15:23:32 crc kubenswrapper[4795]: I0310 15:23:32.467483 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk"] Mar 10 15:23:32 crc kubenswrapper[4795]: I0310 15:23:32.546562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" event={"ID":"d7d04047-d616-4c35-a6f3-7767688d4393","Type":"ContainerStarted","Data":"e06b977c95b7ab81b442c210b7fde34c7372a2af6dbe155d681137f4d51d3ccd"} Mar 10 15:23:33 crc kubenswrapper[4795]: W0310 15:23:33.049373 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec0bfd76_f8c1_48a9_b35b_6307d31446e6.slice/crio-226b52e14bceda67c670cbe6789997ca222c150dca28691a02303c62b6088a47 WatchSource:0}: Error finding container 226b52e14bceda67c670cbe6789997ca222c150dca28691a02303c62b6088a47: Status 404 returned error can't find the container with id 226b52e14bceda67c670cbe6789997ca222c150dca28691a02303c62b6088a47 Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.564304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75" event={"ID":"0171e721-a233-4112-ac5b-503a1aef22eb","Type":"ContainerStarted","Data":"90c88af9ebe3093428290f6f962b8e18a2c50175f175f3ea3d9f3a3c677f4b93"} Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.565031 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75" Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.570577 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"0a98890ab8851ba455c7b85a98ad7dd9a020e91e458aa5b6868584031febf67b"} Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.577328 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5" event={"ID":"1d435dd6-f95b-4883-928d-d010f897bb68","Type":"ContainerStarted","Data":"72981e367a0826282c8f4c890102f9b3ece5a621de4bc3482430d9558ae4d8aa"} Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.577417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5" Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.581776 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6" event={"ID":"d6c4189b-47e3-41a5-83b6-2e6673f8d595","Type":"ContainerStarted","Data":"42e0d08589f88a83b58be4493770ea41273a35f8e296dd48b402689d42c823b6"} Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.581978 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6" Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.592243 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75" podStartSLOduration=5.951438936 podStartE2EDuration="22.592223626s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.552879745 +0000 UTC m=+1025.718620643" lastFinishedPulling="2026-03-10 15:23:29.193664425 +0000 UTC m=+1042.359405333" observedRunningTime="2026-03-10 15:23:33.587372728 +0000 UTC m=+1046.753113626" watchObservedRunningTime="2026-03-10 15:23:33.592223626 +0000 UTC m=+1046.757964524" Mar 
10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.599312 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk" event={"ID":"7ffe6f4f-ca5a-4b61-977c-1fcd22035674","Type":"ContainerStarted","Data":"dc3c3fba14063646fb1ed25cd3455c3f2b8b5199d2635138587b50d3ef282d9f"} Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.599879 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk" Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.613907 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" event={"ID":"ec0bfd76-f8c1-48a9-b35b-6307d31446e6","Type":"ContainerStarted","Data":"226b52e14bceda67c670cbe6789997ca222c150dca28691a02303c62b6088a47"} Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.631058 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6" podStartSLOduration=6.015606498 podStartE2EDuration="22.631036915s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.577985801 +0000 UTC m=+1025.743726689" lastFinishedPulling="2026-03-10 15:23:29.193416218 +0000 UTC m=+1042.359157106" observedRunningTime="2026-03-10 15:23:33.612287399 +0000 UTC m=+1046.778028297" watchObservedRunningTime="2026-03-10 15:23:33.631036915 +0000 UTC m=+1046.796777813" Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.665492 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5" podStartSLOduration=6.179698801 podStartE2EDuration="22.665470849s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.709135482 +0000 UTC m=+1025.874876380" 
lastFinishedPulling="2026-03-10 15:23:29.19490753 +0000 UTC m=+1042.360648428" observedRunningTime="2026-03-10 15:23:33.649997357 +0000 UTC m=+1046.815738255" watchObservedRunningTime="2026-03-10 15:23:33.665470849 +0000 UTC m=+1046.831211747" Mar 10 15:23:33 crc kubenswrapper[4795]: I0310 15:23:33.684204 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk" podStartSLOduration=6.398483961 podStartE2EDuration="22.684183383s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.906967044 +0000 UTC m=+1026.072707942" lastFinishedPulling="2026-03-10 15:23:29.192666466 +0000 UTC m=+1042.358407364" observedRunningTime="2026-03-10 15:23:33.679372146 +0000 UTC m=+1046.845113044" watchObservedRunningTime="2026-03-10 15:23:33.684183383 +0000 UTC m=+1046.849924301" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.626744 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t" event={"ID":"23620549-aa69-4e1b-bfb4-e335532a318c","Type":"ContainerStarted","Data":"f24e7ff549d45c300d2da55f1825f49eb8b227e813547c1ed43582882ce933ab"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.626795 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.634688 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc" event={"ID":"69ee85e7-4d8f-493c-8480-6eefec2091ae","Type":"ContainerStarted","Data":"29ba3868840cd2a786cff729a3c20cb46e632e0ab774dcef9918788193e3cddf"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.634808 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.641698 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" event={"ID":"5242d539-a4e7-4a4c-b485-e8c43ce52546","Type":"ContainerStarted","Data":"e18924838cc225670d43039c855bcf9e3c96b64de11e46e22fd4d3382345fe09"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.642046 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.646673 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn" event={"ID":"d8e67a8d-0858-4177-b9a0-fa1ba281424c","Type":"ContainerStarted","Data":"2c9a29b2ddbde4a56c2d755c77bc222b1a2075d23b7150b108e59e9ba9bd364a"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.647315 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.652687 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" event={"ID":"7a87e459-bb41-405f-8fea-040c8a223373","Type":"ContainerStarted","Data":"f39c6bdfc340f5286140dd146b848ce5d84cb6e986e343b09906a3d1bdb5d961"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.653254 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.658196 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb" 
event={"ID":"a8f94485-8cb5-43ab-b30d-2ad0b0a7836a","Type":"ContainerStarted","Data":"f6a42082adc0b6b9b9a978f86016a6d20e27aa6ee2a0e2db8b4445f5bcab4a37"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.658233 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.665348 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t" podStartSLOduration=7.378144321 podStartE2EDuration="23.665332936s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.906345996 +0000 UTC m=+1026.072086914" lastFinishedPulling="2026-03-10 15:23:29.193534631 +0000 UTC m=+1042.359275529" observedRunningTime="2026-03-10 15:23:34.646305213 +0000 UTC m=+1047.812046111" watchObservedRunningTime="2026-03-10 15:23:34.665332936 +0000 UTC m=+1047.831073834" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.666701 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" podStartSLOduration=3.681494563 podStartE2EDuration="23.666694975s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:13.115306496 +0000 UTC m=+1026.281047404" lastFinishedPulling="2026-03-10 15:23:33.100506918 +0000 UTC m=+1046.266247816" observedRunningTime="2026-03-10 15:23:34.662363722 +0000 UTC m=+1047.828104610" watchObservedRunningTime="2026-03-10 15:23:34.666694975 +0000 UTC m=+1047.832435873" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.672451 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2" 
event={"ID":"85d0556d-e44b-4a30-a0f9-076e356bceef","Type":"ContainerStarted","Data":"eb71b35f6e0c51a83fc7f07f54b73a7e46b9d48fc395732d5c161a854a5b0309"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.673147 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.683084 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" event={"ID":"a4e07d9c-566b-4d59-869a-2d3720455624","Type":"ContainerStarted","Data":"6ad368dde44cbe27240aea768ae3367e6a37946647a9c8c8bb13f3da1cb365cf"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.683740 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.686480 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc" podStartSLOduration=7.209522753 podStartE2EDuration="23.68647033s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.71645517 +0000 UTC m=+1025.882196068" lastFinishedPulling="2026-03-10 15:23:29.193402747 +0000 UTC m=+1042.359143645" observedRunningTime="2026-03-10 15:23:34.683899667 +0000 UTC m=+1047.849640565" watchObservedRunningTime="2026-03-10 15:23:34.68647033 +0000 UTC m=+1047.852211228" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.703755 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" event={"ID":"b01360ed-92af-4626-a226-7cf86bdd51e1","Type":"ContainerStarted","Data":"f9534258b5660120f8727ee5664f66874e70b3c845d3db0e228eae7e8f1fcac3"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.704288 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.710055 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb" podStartSLOduration=6.879803891 podStartE2EDuration="23.710042594s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.361441285 +0000 UTC m=+1025.527182183" lastFinishedPulling="2026-03-10 15:23:29.191679988 +0000 UTC m=+1042.357420886" observedRunningTime="2026-03-10 15:23:34.708433288 +0000 UTC m=+1047.874174186" watchObservedRunningTime="2026-03-10 15:23:34.710042594 +0000 UTC m=+1047.875783492" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.712397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl" event={"ID":"c5c14124-2fc6-4052-b12a-81336c47ae33","Type":"ContainerStarted","Data":"0e5e8a7e8506f9330f44c777f327ef955d4ca8abd2d847999b6ea57a27f2f762"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.712644 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.731908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" event={"ID":"d3b35266-d392-4d69-8ba2-471d69708706","Type":"ContainerStarted","Data":"2794f9b2b890865596de8e7e0b1bbf598b4a367245569e188f7ccba7e07e9ff6"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.732258 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.743818 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" podStartSLOduration=3.56450365 podStartE2EDuration="23.743798958s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.929377843 +0000 UTC m=+1026.095118741" lastFinishedPulling="2026-03-10 15:23:33.108673151 +0000 UTC m=+1046.274414049" observedRunningTime="2026-03-10 15:23:34.740998188 +0000 UTC m=+1047.906739086" watchObservedRunningTime="2026-03-10 15:23:34.743798958 +0000 UTC m=+1047.909539856" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.760722 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" event={"ID":"f23149ba-6bcc-49ac-93fc-60092174c5a8","Type":"ContainerStarted","Data":"33874fc4e806174f41d70013dea5235987c98af894c441ade6714b73b2e21e3e"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.761001 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.773749 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" podStartSLOduration=3.820640148 podStartE2EDuration="23.773726883s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:13.130281093 +0000 UTC m=+1026.296022001" lastFinishedPulling="2026-03-10 15:23:33.083367838 +0000 UTC m=+1046.249108736" observedRunningTime="2026-03-10 15:23:34.769518043 +0000 UTC m=+1047.935258931" watchObservedRunningTime="2026-03-10 15:23:34.773726883 +0000 UTC m=+1047.939467781" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.780915 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh" event={"ID":"9aefc86f-37f5-4056-9e78-0eb01103e984","Type":"ContainerStarted","Data":"a381c305e0869203116ca244f4b6e9cfb55697fce68abfa55f0745bc18b951e1"} Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.781004 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.791612 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn" podStartSLOduration=7.3185586990000004 podStartE2EDuration="23.791592854s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.721420032 +0000 UTC m=+1025.887160930" lastFinishedPulling="2026-03-10 15:23:29.194454187 +0000 UTC m=+1042.360195085" observedRunningTime="2026-03-10 15:23:34.788740462 +0000 UTC m=+1047.954481360" watchObservedRunningTime="2026-03-10 15:23:34.791592854 +0000 UTC m=+1047.957333752" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.815040 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2" podStartSLOduration=7.309551072 podStartE2EDuration="23.815022713s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.699741524 +0000 UTC m=+1025.865482422" lastFinishedPulling="2026-03-10 15:23:29.205213165 +0000 UTC m=+1042.370954063" observedRunningTime="2026-03-10 15:23:34.808638911 +0000 UTC m=+1047.974379809" watchObservedRunningTime="2026-03-10 15:23:34.815022713 +0000 UTC m=+1047.980763611" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.836709 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh" 
podStartSLOduration=5.414509118 podStartE2EDuration="23.836690422s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.72380623 +0000 UTC m=+1025.889547128" lastFinishedPulling="2026-03-10 15:23:31.145987534 +0000 UTC m=+1044.311728432" observedRunningTime="2026-03-10 15:23:34.827312104 +0000 UTC m=+1047.993053002" watchObservedRunningTime="2026-03-10 15:23:34.836690422 +0000 UTC m=+1048.002431320" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.855513 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" podStartSLOduration=3.590800992 podStartE2EDuration="23.85549512s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.927986803 +0000 UTC m=+1026.093727701" lastFinishedPulling="2026-03-10 15:23:33.192680931 +0000 UTC m=+1046.358421829" observedRunningTime="2026-03-10 15:23:34.851479315 +0000 UTC m=+1048.017220213" watchObservedRunningTime="2026-03-10 15:23:34.85549512 +0000 UTC m=+1048.021236018" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.886505 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl" podStartSLOduration=6.868559759 podStartE2EDuration="23.886486655s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.173843375 +0000 UTC m=+1025.339584273" lastFinishedPulling="2026-03-10 15:23:29.191770261 +0000 UTC m=+1042.357511169" observedRunningTime="2026-03-10 15:23:34.882278825 +0000 UTC m=+1048.048019723" watchObservedRunningTime="2026-03-10 15:23:34.886486655 +0000 UTC m=+1048.052227543" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.900804 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" 
podStartSLOduration=3.745136152 podStartE2EDuration="23.900789244s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.92506128 +0000 UTC m=+1026.090802178" lastFinishedPulling="2026-03-10 15:23:33.080714362 +0000 UTC m=+1046.246455270" observedRunningTime="2026-03-10 15:23:34.89784804 +0000 UTC m=+1048.063588938" watchObservedRunningTime="2026-03-10 15:23:34.900789244 +0000 UTC m=+1048.066530142" Mar 10 15:23:34 crc kubenswrapper[4795]: I0310 15:23:34.930212 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" podStartSLOduration=3.963477249 podStartE2EDuration="23.930191074s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:13.141893924 +0000 UTC m=+1026.307634812" lastFinishedPulling="2026-03-10 15:23:33.108607739 +0000 UTC m=+1046.274348637" observedRunningTime="2026-03-10 15:23:34.923313357 +0000 UTC m=+1048.089054255" watchObservedRunningTime="2026-03-10 15:23:34.930191074 +0000 UTC m=+1048.095931972" Mar 10 15:23:37 crc kubenswrapper[4795]: I0310 15:23:37.798708 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" event={"ID":"d7d04047-d616-4c35-a6f3-7767688d4393","Type":"ContainerStarted","Data":"4b3f2cd7a2dff2e7370e77dc68f2e826690b26e6c54692dd69792c0710c01525"} Mar 10 15:23:37 crc kubenswrapper[4795]: I0310 15:23:37.800347 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:37 crc kubenswrapper[4795]: I0310 15:23:37.800371 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" 
event={"ID":"ec0bfd76-f8c1-48a9-b35b-6307d31446e6","Type":"ContainerStarted","Data":"4bb86d7ac8d85b6733a167aa581c29d3a0c80b4837f22d26bf5be2913786ca40"} Mar 10 15:23:37 crc kubenswrapper[4795]: I0310 15:23:37.800475 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:37 crc kubenswrapper[4795]: I0310 15:23:37.830868 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" podStartSLOduration=21.693144 podStartE2EDuration="26.830845499s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:32.338374203 +0000 UTC m=+1045.504115101" lastFinishedPulling="2026-03-10 15:23:37.476075712 +0000 UTC m=+1050.641816600" observedRunningTime="2026-03-10 15:23:37.82353379 +0000 UTC m=+1050.989274698" watchObservedRunningTime="2026-03-10 15:23:37.830845499 +0000 UTC m=+1050.996586397" Mar 10 15:23:37 crc kubenswrapper[4795]: I0310 15:23:37.850757 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" podStartSLOduration=22.431641849000002 podStartE2EDuration="26.850739337s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:33.064055316 +0000 UTC m=+1046.229796214" lastFinishedPulling="2026-03-10 15:23:37.483152804 +0000 UTC m=+1050.648893702" observedRunningTime="2026-03-10 15:23:37.844428897 +0000 UTC m=+1051.010169785" watchObservedRunningTime="2026-03-10 15:23:37.850739337 +0000 UTC m=+1051.016480235" Mar 10 15:23:41 crc kubenswrapper[4795]: I0310 15:23:41.407580 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-rgwvl" Mar 10 15:23:41 crc kubenswrapper[4795]: I0310 15:23:41.491517 4795 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-tlmlb" Mar 10 15:23:41 crc kubenswrapper[4795]: I0310 15:23:41.491573 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-vxz75" Mar 10 15:23:41 crc kubenswrapper[4795]: I0310 15:23:41.614910 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-h9tr6" Mar 10 15:23:41 crc kubenswrapper[4795]: I0310 15:23:41.707513 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-8k7b2" Mar 10 15:23:41 crc kubenswrapper[4795]: I0310 15:23:41.748589 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-8tpdc" Mar 10 15:23:41 crc kubenswrapper[4795]: I0310 15:23:41.856286 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-kcdqn" Mar 10 15:23:41 crc kubenswrapper[4795]: I0310 15:23:41.876664 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-7vht5" Mar 10 15:23:41 crc kubenswrapper[4795]: I0310 15:23:41.956294 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-sz4bh" Mar 10 15:23:41 crc kubenswrapper[4795]: I0310 15:23:41.983502 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-pfmkh" Mar 10 15:23:42 crc kubenswrapper[4795]: I0310 15:23:42.005243 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-bjdwk" Mar 10 15:23:42 crc kubenswrapper[4795]: I0310 15:23:42.064299 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-tglhw" Mar 10 15:23:42 crc kubenswrapper[4795]: I0310 15:23:42.133469 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-kd79t" Mar 10 15:23:42 crc kubenswrapper[4795]: I0310 15:23:42.167690 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-qz5k7" Mar 10 15:23:42 crc kubenswrapper[4795]: I0310 15:23:42.211490 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-5xwgv" Mar 10 15:23:42 crc kubenswrapper[4795]: I0310 15:23:42.266515 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-fl2gq" Mar 10 15:23:42 crc kubenswrapper[4795]: I0310 15:23:42.267479 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-zsx7m" Mar 10 15:23:43 crc kubenswrapper[4795]: I0310 15:23:43.848932 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:43 crc kubenswrapper[4795]: I0310 15:23:43.849577 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:43 crc kubenswrapper[4795]: I0310 15:23:43.859690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-metrics-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:43 crc kubenswrapper[4795]: I0310 15:23:43.865507 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/69a31d53-90dd-46ca-a5ee-8841b89445e6-webhook-certs\") pod \"openstack-operator-controller-manager-76d6f6bb5f-26dcx\" (UID: \"69a31d53-90dd-46ca-a5ee-8841b89445e6\") " pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:44 crc kubenswrapper[4795]: I0310 15:23:44.092316 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:44 crc kubenswrapper[4795]: I0310 15:23:44.536323 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx"] Mar 10 15:23:44 crc kubenswrapper[4795]: W0310 15:23:44.538338 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a31d53_90dd_46ca_a5ee_8841b89445e6.slice/crio-d8abdbf20d812259b294bac9558113811cebbd8b060550347f87a366eb84a072 WatchSource:0}: Error finding container d8abdbf20d812259b294bac9558113811cebbd8b060550347f87a366eb84a072: Status 404 returned error can't find the container with id d8abdbf20d812259b294bac9558113811cebbd8b060550347f87a366eb84a072 Mar 10 15:23:44 crc kubenswrapper[4795]: I0310 15:23:44.871265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" event={"ID":"69a31d53-90dd-46ca-a5ee-8841b89445e6","Type":"ContainerStarted","Data":"d8abdbf20d812259b294bac9558113811cebbd8b060550347f87a366eb84a072"} Mar 10 15:23:48 crc kubenswrapper[4795]: I0310 15:23:48.204668 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-5995f4446f-4vlhk" Mar 10 15:23:48 crc kubenswrapper[4795]: I0310 15:23:48.211531 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb" Mar 10 15:23:53 crc kubenswrapper[4795]: I0310 15:23:53.362359 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" event={"ID":"69a31d53-90dd-46ca-a5ee-8841b89445e6","Type":"ContainerStarted","Data":"fffebeb2dfbabe8d4e4d28f8b695d9a06dff434c1156c5c44810fc8c7975278a"} Mar 10 
15:23:53 crc kubenswrapper[4795]: I0310 15:23:53.363013 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" Mar 10 15:23:53 crc kubenswrapper[4795]: I0310 15:23:53.394080 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" podStartSLOduration=42.394047996 podStartE2EDuration="42.394047996s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:23:53.389783085 +0000 UTC m=+1066.555523983" watchObservedRunningTime="2026-03-10 15:23:53.394047996 +0000 UTC m=+1066.559788894" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.150883 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552604-2x2lf"] Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.152212 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552604-2x2lf" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.155108 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.155374 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.158368 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.164163 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552604-2x2lf"] Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.343811 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9r8t\" (UniqueName: \"kubernetes.io/projected/43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9-kube-api-access-w9r8t\") pod \"auto-csr-approver-29552604-2x2lf\" (UID: \"43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9\") " pod="openshift-infra/auto-csr-approver-29552604-2x2lf" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.415709 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" event={"ID":"552bb17b-df18-40ea-8688-f5f5c16e7c5b","Type":"ContainerStarted","Data":"67296266cca5d1e2f83bfb6a5e996db835e5acbd566b7389f4a250927dba29ed"} Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.416236 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.418627 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm" 
event={"ID":"44fbebe4-6a17-4378-9f39-bda40adb7e02","Type":"ContainerStarted","Data":"267dc9797e58e5334c1f15d651862842c40861776d6ee3772b4de050900b458c"} Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.420648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" event={"ID":"9968b76f-ff42-4d0e-9096-a229cf314dcb","Type":"ContainerStarted","Data":"e5d76318846807be910126b781b378b0b32df9beda64013748cd093b6f576a22"} Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.420984 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.445434 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9r8t\" (UniqueName: \"kubernetes.io/projected/43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9-kube-api-access-w9r8t\") pod \"auto-csr-approver-29552604-2x2lf\" (UID: \"43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9\") " pod="openshift-infra/auto-csr-approver-29552604-2x2lf" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.450484 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" podStartSLOduration=2.016865851 podStartE2EDuration="49.450458335s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.357725859 +0000 UTC m=+1025.523466767" lastFinishedPulling="2026-03-10 15:23:59.791318363 +0000 UTC m=+1072.957059251" observedRunningTime="2026-03-10 15:24:00.444055522 +0000 UTC m=+1073.609796420" watchObservedRunningTime="2026-03-10 15:24:00.450458335 +0000 UTC m=+1073.616199243" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.473397 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" podStartSLOduration=2.402484481 podStartE2EDuration="49.47336578s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:12.720939228 +0000 UTC m=+1025.886680126" lastFinishedPulling="2026-03-10 15:23:59.791820527 +0000 UTC m=+1072.957561425" observedRunningTime="2026-03-10 15:24:00.46673568 +0000 UTC m=+1073.632476618" watchObservedRunningTime="2026-03-10 15:24:00.47336578 +0000 UTC m=+1073.639106728" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.480155 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9r8t\" (UniqueName: \"kubernetes.io/projected/43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9-kube-api-access-w9r8t\") pod \"auto-csr-approver-29552604-2x2lf\" (UID: \"43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9\") " pod="openshift-infra/auto-csr-approver-29552604-2x2lf" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.483337 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552604-2x2lf" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.506713 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hkmpm" podStartSLOduration=2.822845991 podStartE2EDuration="49.506668081s" podCreationTimestamp="2026-03-10 15:23:11 +0000 UTC" firstStartedPulling="2026-03-10 15:23:13.107569695 +0000 UTC m=+1026.273310603" lastFinishedPulling="2026-03-10 15:23:59.791391755 +0000 UTC m=+1072.957132693" observedRunningTime="2026-03-10 15:24:00.489938393 +0000 UTC m=+1073.655679321" watchObservedRunningTime="2026-03-10 15:24:00.506668081 +0000 UTC m=+1073.672408989" Mar 10 15:24:00 crc kubenswrapper[4795]: I0310 15:24:00.975655 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552604-2x2lf"] Mar 10 15:24:00 crc kubenswrapper[4795]: W0310 15:24:00.981221 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43e6ddbe_efaa_4fe9_a5b8_2e3547d073d9.slice/crio-06685eb49665dea2f53fd2ec084274980664192d7278435449149fff93df0574 WatchSource:0}: Error finding container 06685eb49665dea2f53fd2ec084274980664192d7278435449149fff93df0574: Status 404 returned error can't find the container with id 06685eb49665dea2f53fd2ec084274980664192d7278435449149fff93df0574 Mar 10 15:24:01 crc kubenswrapper[4795]: I0310 15:24:01.427925 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552604-2x2lf" event={"ID":"43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9","Type":"ContainerStarted","Data":"06685eb49665dea2f53fd2ec084274980664192d7278435449149fff93df0574"} Mar 10 15:24:04 crc kubenswrapper[4795]: I0310 15:24:04.097976 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-76d6f6bb5f-26dcx" 
Mar 10 15:24:04 crc kubenswrapper[4795]: I0310 15:24:04.458499 4795 generic.go:334] "Generic (PLEG): container finished" podID="43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9" containerID="dbff0b7c150499c38617bf7d5602148c0546863529a9fd0f232fbe78a9a6b56f" exitCode=0 Mar 10 15:24:04 crc kubenswrapper[4795]: I0310 15:24:04.458552 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552604-2x2lf" event={"ID":"43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9","Type":"ContainerDied","Data":"dbff0b7c150499c38617bf7d5602148c0546863529a9fd0f232fbe78a9a6b56f"} Mar 10 15:24:05 crc kubenswrapper[4795]: I0310 15:24:05.725027 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552604-2x2lf" Mar 10 15:24:05 crc kubenswrapper[4795]: I0310 15:24:05.910492 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9r8t\" (UniqueName: \"kubernetes.io/projected/43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9-kube-api-access-w9r8t\") pod \"43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9\" (UID: \"43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9\") " Mar 10 15:24:05 crc kubenswrapper[4795]: I0310 15:24:05.919445 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9-kube-api-access-w9r8t" (OuterVolumeSpecName: "kube-api-access-w9r8t") pod "43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9" (UID: "43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9"). InnerVolumeSpecName "kube-api-access-w9r8t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:24:06 crc kubenswrapper[4795]: I0310 15:24:06.012011 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9r8t\" (UniqueName: \"kubernetes.io/projected/43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9-kube-api-access-w9r8t\") on node \"crc\" DevicePath \"\"" Mar 10 15:24:06 crc kubenswrapper[4795]: I0310 15:24:06.476428 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552604-2x2lf" event={"ID":"43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9","Type":"ContainerDied","Data":"06685eb49665dea2f53fd2ec084274980664192d7278435449149fff93df0574"} Mar 10 15:24:06 crc kubenswrapper[4795]: I0310 15:24:06.476463 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06685eb49665dea2f53fd2ec084274980664192d7278435449149fff93df0574" Mar 10 15:24:06 crc kubenswrapper[4795]: I0310 15:24:06.476526 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552604-2x2lf" Mar 10 15:24:06 crc kubenswrapper[4795]: I0310 15:24:06.819040 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552598-4zw7j"] Mar 10 15:24:06 crc kubenswrapper[4795]: I0310 15:24:06.828467 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552598-4zw7j"] Mar 10 15:24:07 crc kubenswrapper[4795]: I0310 15:24:07.496827 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cf4236b-2e67-46b2-9d0b-f38d75ed5213" path="/var/lib/kubelet/pods/1cf4236b-2e67-46b2-9d0b-f38d75ed5213/volumes" Mar 10 15:24:11 crc kubenswrapper[4795]: I0310 15:24:11.491679 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-qhk6r" Mar 10 15:24:11 crc kubenswrapper[4795]: I0310 15:24:11.696345 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-xhvcm" Mar 10 15:24:26 crc kubenswrapper[4795]: E0310 15:24:26.561840 4795 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.086s" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.984559 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lgtgt"] Mar 10 15:24:29 crc kubenswrapper[4795]: E0310 15:24:29.985112 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9" containerName="oc" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.985124 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9" containerName="oc" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.985257 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9" containerName="oc" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.985884 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.987767 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.987928 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-hcf9w" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.988116 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.988245 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.988651 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.989414 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lgtgt\" (UID: \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.989638 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-config\") pod \"dnsmasq-dns-78dd6ddcc-lgtgt\" (UID: \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:24:29 crc kubenswrapper[4795]: I0310 15:24:29.989686 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5j6m\" (UniqueName: \"kubernetes.io/projected/9bdd186c-fb5a-43f3-967e-fb0123a920b2-kube-api-access-q5j6m\") pod \"dnsmasq-dns-78dd6ddcc-lgtgt\" (UID: 
\"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:24:30 crc kubenswrapper[4795]: I0310 15:24:30.047656 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lgtgt"] Mar 10 15:24:30 crc kubenswrapper[4795]: I0310 15:24:30.091036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5j6m\" (UniqueName: \"kubernetes.io/projected/9bdd186c-fb5a-43f3-967e-fb0123a920b2-kube-api-access-q5j6m\") pod \"dnsmasq-dns-78dd6ddcc-lgtgt\" (UID: \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:24:30 crc kubenswrapper[4795]: I0310 15:24:30.091153 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lgtgt\" (UID: \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:24:30 crc kubenswrapper[4795]: I0310 15:24:30.091317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-config\") pod \"dnsmasq-dns-78dd6ddcc-lgtgt\" (UID: \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:24:30 crc kubenswrapper[4795]: I0310 15:24:30.091939 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-lgtgt\" (UID: \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:24:30 crc kubenswrapper[4795]: I0310 15:24:30.092053 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-config\") pod 
\"dnsmasq-dns-78dd6ddcc-lgtgt\" (UID: \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:24:30 crc kubenswrapper[4795]: I0310 15:24:30.116401 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5j6m\" (UniqueName: \"kubernetes.io/projected/9bdd186c-fb5a-43f3-967e-fb0123a920b2-kube-api-access-q5j6m\") pod \"dnsmasq-dns-78dd6ddcc-lgtgt\" (UID: \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:24:30 crc kubenswrapper[4795]: I0310 15:24:30.302787 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:24:30 crc kubenswrapper[4795]: I0310 15:24:30.755409 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lgtgt"] Mar 10 15:24:34 crc kubenswrapper[4795]: I0310 15:24:34.839717 4795 scope.go:117] "RemoveContainer" containerID="c51f599633bfc406ad50a24970472587432a834637b95b39b22fadb24e99db5e" Mar 10 15:24:34 crc kubenswrapper[4795]: I0310 15:24:34.902683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" event={"ID":"9bdd186c-fb5a-43f3-967e-fb0123a920b2","Type":"ContainerStarted","Data":"d90a55c9d9bb788c46723a692bde37d79bdc83e5916b8d83b7ee5fa4fde0eaae"} Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.630706 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-krr6z"] Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.631711 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.650870 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-krr6z"] Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.697085 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-config\") pod \"dnsmasq-dns-5ccc8479f9-krr6z\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.697227 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-krr6z\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.697334 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqvk2\" (UniqueName: \"kubernetes.io/projected/f54b4a29-0fff-4ce9-abcb-a14977f1d101-kube-api-access-vqvk2\") pod \"dnsmasq-dns-5ccc8479f9-krr6z\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.798551 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqvk2\" (UniqueName: \"kubernetes.io/projected/f54b4a29-0fff-4ce9-abcb-a14977f1d101-kube-api-access-vqvk2\") pod \"dnsmasq-dns-5ccc8479f9-krr6z\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.798658 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-config\") pod \"dnsmasq-dns-5ccc8479f9-krr6z\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.798712 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-krr6z\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.799798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-krr6z\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.800492 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-config\") pod \"dnsmasq-dns-5ccc8479f9-krr6z\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.843804 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqvk2\" (UniqueName: \"kubernetes.io/projected/f54b4a29-0fff-4ce9-abcb-a14977f1d101-kube-api-access-vqvk2\") pod \"dnsmasq-dns-5ccc8479f9-krr6z\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.869055 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lgtgt"] Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.906878 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-czckz"] Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.908048 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.919522 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-czckz"] Mar 10 15:24:35 crc kubenswrapper[4795]: I0310 15:24:35.949131 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.001233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5hsg\" (UniqueName: \"kubernetes.io/projected/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-kube-api-access-m5hsg\") pod \"dnsmasq-dns-57d769cc4f-czckz\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.001611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-config\") pod \"dnsmasq-dns-57d769cc4f-czckz\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.001654 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-czckz\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.102528 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-config\") pod \"dnsmasq-dns-57d769cc4f-czckz\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.102592 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-czckz\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.102679 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5hsg\" (UniqueName: \"kubernetes.io/projected/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-kube-api-access-m5hsg\") pod \"dnsmasq-dns-57d769cc4f-czckz\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.104097 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-config\") pod \"dnsmasq-dns-57d769cc4f-czckz\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.104868 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-czckz\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.138122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5hsg\" (UniqueName: \"kubernetes.io/projected/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-kube-api-access-m5hsg\") pod 
\"dnsmasq-dns-57d769cc4f-czckz\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.221616 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.729032 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.730525 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.742646 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.742745 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.742833 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.743113 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.743137 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f5j5r" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.743228 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.743482 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.757160 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-5ccc8479f9-krr6z"] Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.768139 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.810480 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.810524 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b0778eb-949e-46e9-bc72-cc42ec440aa2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.810556 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.810585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.810617 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.810634 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.810771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.810852 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b0778eb-949e-46e9-bc72-cc42ec440aa2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.810881 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkmzq\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-kube-api-access-fkmzq\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.810951 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.811076 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.912854 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b0778eb-949e-46e9-bc72-cc42ec440aa2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.912906 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.912951 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.912981 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.912996 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.913021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.913045 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b0778eb-949e-46e9-bc72-cc42ec440aa2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.913096 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkmzq\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-kube-api-access-fkmzq\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.913116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.913134 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.913166 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.914200 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.914682 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.914739 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.915289 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.915423 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" event={"ID":"f54b4a29-0fff-4ce9-abcb-a14977f1d101","Type":"ContainerStarted","Data":"cdc3df8887c9725ff3cad17e97533e6f9955f1236cd59c36ebc492edbd8f09e0"} Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.915975 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.917340 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.922584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b0778eb-949e-46e9-bc72-cc42ec440aa2-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.923019 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.923027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.924249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b0778eb-949e-46e9-bc72-cc42ec440aa2-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.928874 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkmzq\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-kube-api-access-fkmzq\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.932036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.936386 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.937835 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.940086 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.941759 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.941700 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-xzvpn" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.941854 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.948185 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 15:24:36 crc kubenswrapper[4795]: I0310 15:24:36.949611 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.014319 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e5711d-12e4-458f-a944-6b37aca4afa3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.014598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b8e5711d-12e4-458f-a944-6b37aca4afa3-config-data-default\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.014685 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b8e5711d-12e4-458f-a944-6b37aca4afa3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.014781 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b8e5711d-12e4-458f-a944-6b37aca4afa3-kolla-config\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.014865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.014952 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fct7\" (UniqueName: \"kubernetes.io/projected/b8e5711d-12e4-458f-a944-6b37aca4afa3-kube-api-access-5fct7\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.015026 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8e5711d-12e4-458f-a944-6b37aca4afa3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.015129 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b8e5711d-12e4-458f-a944-6b37aca4afa3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.033790 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.035087 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.037109 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.037438 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.037703 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.037888 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.037994 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.038262 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-24pfv" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.039553 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.061166 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.069410 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.114587 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-czckz"] Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.117031 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.117182 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fct7\" (UniqueName: \"kubernetes.io/projected/b8e5711d-12e4-458f-a944-6b37aca4afa3-kube-api-access-5fct7\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.117499 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.117590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8e5711d-12e4-458f-a944-6b37aca4afa3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.117672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b8e5711d-12e4-458f-a944-6b37aca4afa3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.117744 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.117810 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.117882 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rv5s\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-kube-api-access-9rv5s\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.117958 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-config-data\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.118021 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-erlang-cookie-secret\") pod 
\"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.118117 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.118182 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.118266 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8e5711d-12e4-458f-a944-6b37aca4afa3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.118709 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b8e5711d-12e4-458f-a944-6b37aca4afa3-config-data-default\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.118797 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b8e5711d-12e4-458f-a944-6b37aca4afa3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " 
pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.118870 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.118934 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.119017 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b8e5711d-12e4-458f-a944-6b37aca4afa3-kolla-config\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.119104 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.119303 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.119595 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b8e5711d-12e4-458f-a944-6b37aca4afa3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.120400 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/b8e5711d-12e4-458f-a944-6b37aca4afa3-config-data-default\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.120680 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/b8e5711d-12e4-458f-a944-6b37aca4afa3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.121362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b8e5711d-12e4-458f-a944-6b37aca4afa3-kolla-config\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.121995 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8e5711d-12e4-458f-a944-6b37aca4afa3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.122769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b8e5711d-12e4-458f-a944-6b37aca4afa3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.144734 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fct7\" (UniqueName: \"kubernetes.io/projected/b8e5711d-12e4-458f-a944-6b37aca4afa3-kube-api-access-5fct7\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.151787 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-galera-0\" (UID: \"b8e5711d-12e4-458f-a944-6b37aca4afa3\") " pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.220151 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.220505 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.220533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " 
pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.220583 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.220621 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.220675 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.220699 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.220727 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rv5s\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-kube-api-access-9rv5s\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.220776 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-config-data\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.220802 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.220824 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.221206 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.221545 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.222218 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.222430 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-config-data\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.222879 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.223513 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.225000 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.225579 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.229746 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.232022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.242789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.245005 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rv5s\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-kube-api-access-9rv5s\") pod \"rabbitmq-server-0\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " pod="openstack/rabbitmq-server-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.256132 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 10 15:24:37 crc kubenswrapper[4795]: I0310 15:24:37.355800 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:37.995503 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-czckz" event={"ID":"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00","Type":"ContainerStarted","Data":"64b17beacfc85a170d7bed2a36ace94ec146cc3377ff0ba7c209d3de31a5fb7e"} Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.136696 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.140171 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.143462 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.143586 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-766x5" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.143820 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.144206 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.154955 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.267707 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 
15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.267795 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.267826 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.267847 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.267863 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.267892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 
15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.268034 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.268167 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwvl5\" (UniqueName: \"kubernetes.io/projected/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-kube-api-access-pwvl5\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.332703 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.339293 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.354595 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.354788 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-xxvfh" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.354816 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.369498 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.369566 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.369597 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.369614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-operator-scripts\") 
pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.369629 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.369656 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.369679 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.369716 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwvl5\" (UniqueName: \"kubernetes.io/projected/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-kube-api-access-pwvl5\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.371803 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") device mount path \"/mnt/openstack/pv04\"" 
pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.373423 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.374136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.377529 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.380863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.385187 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.401826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: 
\"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.413931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwvl5\" (UniqueName: \"kubernetes.io/projected/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-kube-api-access-pwvl5\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.415094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.420729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d\") " pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.466364 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.469973 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.470501 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-config-data\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.470559 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-kolla-config\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.470596 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.470612 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-682gl\" (UniqueName: \"kubernetes.io/projected/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-kube-api-access-682gl\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.470640 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: W0310 15:24:38.482649 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b0778eb_949e_46e9_bc72_cc42ec440aa2.slice/crio-77e560db3899b1ef8ba7d47cb762ecd235a93da39c5376d9cccb6e3aa082d09a WatchSource:0}: Error finding container 77e560db3899b1ef8ba7d47cb762ecd235a93da39c5376d9cccb6e3aa082d09a: Status 404 returned error can't find the container with id 77e560db3899b1ef8ba7d47cb762ecd235a93da39c5376d9cccb6e3aa082d09a Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.572114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.572152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-682gl\" (UniqueName: \"kubernetes.io/projected/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-kube-api-access-682gl\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.572185 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.572246 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-config-data\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.572275 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-kolla-config\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.573548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-kolla-config\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.574617 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-config-data\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.579485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-memcached-tls-certs\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.590798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-combined-ca-bundle\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.598226 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-682gl\" (UniqueName: \"kubernetes.io/projected/f3f06c2a-0098-46d6-96e5-6cbe9caf24ef-kube-api-access-682gl\") pod \"memcached-0\" (UID: \"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef\") " pod="openstack/memcached-0" Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.624303 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.634997 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:24:38 crc kubenswrapper[4795]: W0310 15:24:38.640507 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1ccf1a8_3778_482d_b6b5_303de43c6a7e.slice/crio-c240581124d1424e85a4e35aae8efb12e4614062bbfd9b8f8a1c31d6e9146809 WatchSource:0}: Error finding container c240581124d1424e85a4e35aae8efb12e4614062bbfd9b8f8a1c31d6e9146809: Status 404 returned error can't find the container with id c240581124d1424e85a4e35aae8efb12e4614062bbfd9b8f8a1c31d6e9146809 Mar 10 15:24:38 crc kubenswrapper[4795]: W0310 15:24:38.649381 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8e5711d_12e4_458f_a944_6b37aca4afa3.slice/crio-c180a1c5901f7b85639ba071132a0206a363d6caaf78f7197b685d801770e391 WatchSource:0}: Error finding container c180a1c5901f7b85639ba071132a0206a363d6caaf78f7197b685d801770e391: Status 404 returned error can't find the container with id c180a1c5901f7b85639ba071132a0206a363d6caaf78f7197b685d801770e391 Mar 10 15:24:38 crc kubenswrapper[4795]: I0310 15:24:38.669544 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 10 15:24:39 crc kubenswrapper[4795]: I0310 15:24:39.009808 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c1ccf1a8-3778-482d-b6b5-303de43c6a7e","Type":"ContainerStarted","Data":"c240581124d1424e85a4e35aae8efb12e4614062bbfd9b8f8a1c31d6e9146809"} Mar 10 15:24:39 crc kubenswrapper[4795]: I0310 15:24:39.010283 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 10 15:24:39 crc kubenswrapper[4795]: I0310 15:24:39.013923 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9b0778eb-949e-46e9-bc72-cc42ec440aa2","Type":"ContainerStarted","Data":"77e560db3899b1ef8ba7d47cb762ecd235a93da39c5376d9cccb6e3aa082d09a"} Mar 10 15:24:39 crc kubenswrapper[4795]: I0310 15:24:39.016012 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b8e5711d-12e4-458f-a944-6b37aca4afa3","Type":"ContainerStarted","Data":"c180a1c5901f7b85639ba071132a0206a363d6caaf78f7197b685d801770e391"} Mar 10 15:24:39 crc kubenswrapper[4795]: W0310 15:24:39.036277 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd0fbf0b_7154_4f2e_b5b8_896cdceb4d0d.slice/crio-0db635cbce76ff6565465a0aede581e0ee9ce115bb61b43c34ff1a1b9688f15d WatchSource:0}: Error finding container 0db635cbce76ff6565465a0aede581e0ee9ce115bb61b43c34ff1a1b9688f15d: Status 404 returned error can't find the container with id 0db635cbce76ff6565465a0aede581e0ee9ce115bb61b43c34ff1a1b9688f15d Mar 10 15:24:39 crc kubenswrapper[4795]: I0310 15:24:39.488144 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 10 15:24:40 crc kubenswrapper[4795]: I0310 15:24:40.023220 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d","Type":"ContainerStarted","Data":"0db635cbce76ff6565465a0aede581e0ee9ce115bb61b43c34ff1a1b9688f15d"} Mar 10 15:24:40 crc kubenswrapper[4795]: I0310 15:24:40.024216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef","Type":"ContainerStarted","Data":"2e04cf6b732007e7d428b3131bcd32f8ade59673e1144428909a46a4140015e0"} Mar 10 15:24:40 crc kubenswrapper[4795]: I0310 15:24:40.575439 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:24:40 crc kubenswrapper[4795]: I0310 15:24:40.576285 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 15:24:40 crc kubenswrapper[4795]: I0310 15:24:40.585377 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-v6zm9" Mar 10 15:24:40 crc kubenswrapper[4795]: I0310 15:24:40.588894 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:24:40 crc kubenswrapper[4795]: I0310 15:24:40.758887 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md4b2\" (UniqueName: \"kubernetes.io/projected/9095d427-1630-4118-881c-eca71ebf01dc-kube-api-access-md4b2\") pod \"kube-state-metrics-0\" (UID: \"9095d427-1630-4118-881c-eca71ebf01dc\") " pod="openstack/kube-state-metrics-0" Mar 10 15:24:40 crc kubenswrapper[4795]: I0310 15:24:40.861997 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md4b2\" (UniqueName: \"kubernetes.io/projected/9095d427-1630-4118-881c-eca71ebf01dc-kube-api-access-md4b2\") pod \"kube-state-metrics-0\" (UID: \"9095d427-1630-4118-881c-eca71ebf01dc\") " pod="openstack/kube-state-metrics-0" Mar 10 15:24:40 crc kubenswrapper[4795]: I0310 15:24:40.894350 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md4b2\" (UniqueName: \"kubernetes.io/projected/9095d427-1630-4118-881c-eca71ebf01dc-kube-api-access-md4b2\") pod \"kube-state-metrics-0\" (UID: \"9095d427-1630-4118-881c-eca71ebf01dc\") " pod="openstack/kube-state-metrics-0" Mar 10 15:24:40 crc kubenswrapper[4795]: I0310 15:24:40.908393 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 10 15:24:41 crc kubenswrapper[4795]: I0310 15:24:41.375147 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 10 15:24:42 crc kubenswrapper[4795]: I0310 15:24:42.042941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9095d427-1630-4118-881c-eca71ebf01dc","Type":"ContainerStarted","Data":"28f3dbc3e1226ac07706955a95e36c25b3ca9fc1767580d98b20a2dcd26dc6e1"} Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.370304 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-x9p5v"] Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.372463 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.374415 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-z7kxr" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.376343 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.376497 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.376586 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-sblqq"] Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.377930 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.385466 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x9p5v"] Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.403372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-sblqq"] Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.515617 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4823094a-6e7f-49da-9aa5-7d67b893896c-var-log-ovn\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.515666 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-var-run\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " 
pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.515691 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4823094a-6e7f-49da-9aa5-7d67b893896c-ovn-controller-tls-certs\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.516526 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4823094a-6e7f-49da-9aa5-7d67b893896c-var-run\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.516635 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-var-lib\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.516674 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4823094a-6e7f-49da-9aa5-7d67b893896c-var-run-ovn\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.516764 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4823094a-6e7f-49da-9aa5-7d67b893896c-combined-ca-bundle\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" 
Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.516896 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f8a962-ce88-4e90-91b3-5272104b9d18-scripts\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.517026 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4823094a-6e7f-49da-9aa5-7d67b893896c-scripts\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.517112 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk266\" (UniqueName: \"kubernetes.io/projected/4823094a-6e7f-49da-9aa5-7d67b893896c-kube-api-access-hk266\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.517141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-var-log\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.517225 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92fzg\" (UniqueName: \"kubernetes.io/projected/70f8a962-ce88-4e90-91b3-5272104b9d18-kube-api-access-92fzg\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc 
kubenswrapper[4795]: I0310 15:24:43.517281 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-etc-ovs\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.619912 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk266\" (UniqueName: \"kubernetes.io/projected/4823094a-6e7f-49da-9aa5-7d67b893896c-kube-api-access-hk266\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.619955 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-var-log\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.619995 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92fzg\" (UniqueName: \"kubernetes.io/projected/70f8a962-ce88-4e90-91b3-5272104b9d18-kube-api-access-92fzg\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.620021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-etc-ovs\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.620076 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4823094a-6e7f-49da-9aa5-7d67b893896c-var-log-ovn\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.620117 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-var-run\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.621883 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-etc-ovs\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.621924 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-var-log\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.622202 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/4823094a-6e7f-49da-9aa5-7d67b893896c-var-log-ovn\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.622552 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-var-run\") pod \"ovn-controller-ovs-sblqq\" (UID: 
\"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.626090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4823094a-6e7f-49da-9aa5-7d67b893896c-ovn-controller-tls-certs\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.626167 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4823094a-6e7f-49da-9aa5-7d67b893896c-var-run\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.626192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-var-lib\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.626222 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4823094a-6e7f-49da-9aa5-7d67b893896c-var-run-ovn\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.626253 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4823094a-6e7f-49da-9aa5-7d67b893896c-combined-ca-bundle\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 
15:24:43.626317 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f8a962-ce88-4e90-91b3-5272104b9d18-scripts\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.626371 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4823094a-6e7f-49da-9aa5-7d67b893896c-scripts\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.626445 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/4823094a-6e7f-49da-9aa5-7d67b893896c-var-run\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.626569 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/4823094a-6e7f-49da-9aa5-7d67b893896c-var-run-ovn\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.626569 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/70f8a962-ce88-4e90-91b3-5272104b9d18-var-lib\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.628536 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/70f8a962-ce88-4e90-91b3-5272104b9d18-scripts\") pod 
\"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.629907 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4823094a-6e7f-49da-9aa5-7d67b893896c-scripts\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.632288 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4823094a-6e7f-49da-9aa5-7d67b893896c-combined-ca-bundle\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.637447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/4823094a-6e7f-49da-9aa5-7d67b893896c-ovn-controller-tls-certs\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.642516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92fzg\" (UniqueName: \"kubernetes.io/projected/70f8a962-ce88-4e90-91b3-5272104b9d18-kube-api-access-92fzg\") pod \"ovn-controller-ovs-sblqq\" (UID: \"70f8a962-ce88-4e90-91b3-5272104b9d18\") " pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.644533 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk266\" (UniqueName: \"kubernetes.io/projected/4823094a-6e7f-49da-9aa5-7d67b893896c-kube-api-access-hk266\") pod \"ovn-controller-x9p5v\" (UID: \"4823094a-6e7f-49da-9aa5-7d67b893896c\") " pod="openstack/ovn-controller-x9p5v" Mar 10 
15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.702914 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x9p5v" Mar 10 15:24:43 crc kubenswrapper[4795]: I0310 15:24:43.710428 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:24:44 crc kubenswrapper[4795]: I0310 15:24:44.845909 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 15:24:44 crc kubenswrapper[4795]: I0310 15:24:44.848604 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:44 crc kubenswrapper[4795]: I0310 15:24:44.850418 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 10 15:24:44 crc kubenswrapper[4795]: I0310 15:24:44.851149 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 10 15:24:44 crc kubenswrapper[4795]: I0310 15:24:44.852087 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 10 15:24:44 crc kubenswrapper[4795]: I0310 15:24:44.853444 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8wg22" Mar 10 15:24:44 crc kubenswrapper[4795]: I0310 15:24:44.853654 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 10 15:24:44 crc kubenswrapper[4795]: I0310 15:24:44.862125 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.049245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: 
\"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.049297 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8572cc94-1e6e-406c-b57b-56167baa0a87-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.049366 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8572cc94-1e6e-406c-b57b-56167baa0a87-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.049397 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57gkz\" (UniqueName: \"kubernetes.io/projected/8572cc94-1e6e-406c-b57b-56167baa0a87-kube-api-access-57gkz\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.049420 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8572cc94-1e6e-406c-b57b-56167baa0a87-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.049461 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8572cc94-1e6e-406c-b57b-56167baa0a87-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " 
pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.049504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8572cc94-1e6e-406c-b57b-56167baa0a87-config\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.049548 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8572cc94-1e6e-406c-b57b-56167baa0a87-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.150545 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8572cc94-1e6e-406c-b57b-56167baa0a87-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.150604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57gkz\" (UniqueName: \"kubernetes.io/projected/8572cc94-1e6e-406c-b57b-56167baa0a87-kube-api-access-57gkz\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.150629 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8572cc94-1e6e-406c-b57b-56167baa0a87-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.150676 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8572cc94-1e6e-406c-b57b-56167baa0a87-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.150718 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8572cc94-1e6e-406c-b57b-56167baa0a87-config\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.150756 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8572cc94-1e6e-406c-b57b-56167baa0a87-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.150794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.150813 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8572cc94-1e6e-406c-b57b-56167baa0a87-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.151378 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8572cc94-1e6e-406c-b57b-56167baa0a87-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: 
\"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.152128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8572cc94-1e6e-406c-b57b-56167baa0a87-config\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.152047 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8572cc94-1e6e-406c-b57b-56167baa0a87-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.152633 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.156690 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8572cc94-1e6e-406c-b57b-56167baa0a87-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.164659 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8572cc94-1e6e-406c-b57b-56167baa0a87-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.170350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-57gkz\" (UniqueName: \"kubernetes.io/projected/8572cc94-1e6e-406c-b57b-56167baa0a87-kube-api-access-57gkz\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.172769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.183413 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8572cc94-1e6e-406c-b57b-56167baa0a87-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"8572cc94-1e6e-406c-b57b-56167baa0a87\") " pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:45 crc kubenswrapper[4795]: I0310 15:24:45.480692 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.512131 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.515596 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.522914 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.558485 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.559244 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.559846 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-2gmmt" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.560284 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.701249 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lklf4\" (UniqueName: \"kubernetes.io/projected/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-kube-api-access-lklf4\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.701520 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.701616 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.701711 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.701805 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.701891 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.701975 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.702089 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 
15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.803644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.803884 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.803971 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.804326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.804452 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.805106 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.804820 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.804670 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.805112 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lklf4\" (UniqueName: \"kubernetes.io/projected/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-kube-api-access-lklf4\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.805408 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.805456 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-config\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " 
pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.806395 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-config\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.812168 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.812755 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.813110 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.828238 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lklf4\" (UniqueName: \"kubernetes.io/projected/d58ad85e-6f98-4ba8-97b9-656dda7a5b93-kube-api-access-lklf4\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.833471 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d58ad85e-6f98-4ba8-97b9-656dda7a5b93\") " pod="openstack/ovsdbserver-nb-0" Mar 10 15:24:47 crc kubenswrapper[4795]: I0310 15:24:47.908001 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:00 crc kubenswrapper[4795]: E0310 15:25:00.217547 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Mar 10 15:25:00 crc kubenswrapper[4795]: E0310 15:25:00.218581 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fkmzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(9b0778eb-949e-46e9-bc72-cc42ec440aa2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:25:00 crc 
kubenswrapper[4795]: E0310 15:25:00.219811 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" Mar 10 15:25:01 crc kubenswrapper[4795]: E0310 15:25:01.225902 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" Mar 10 15:25:01 crc kubenswrapper[4795]: E0310 15:25:01.766440 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 10 15:25:01 crc kubenswrapper[4795]: E0310 15:25:01.766614 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q5j6m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-lgtgt_openstack(9bdd186c-fb5a-43f3-967e-fb0123a920b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:25:01 crc kubenswrapper[4795]: E0310 15:25:01.768229 4795 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" podUID="9bdd186c-fb5a-43f3-967e-fb0123a920b2" Mar 10 15:25:01 crc kubenswrapper[4795]: E0310 15:25:01.940482 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 10 15:25:01 crc kubenswrapper[4795]: E0310 15:25:01.940887 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vqvk2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-krr6z_openstack(f54b4a29-0fff-4ce9-abcb-a14977f1d101): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:25:01 crc kubenswrapper[4795]: E0310 15:25:01.942113 4795 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" podUID="f54b4a29-0fff-4ce9-abcb-a14977f1d101" Mar 10 15:25:01 crc kubenswrapper[4795]: E0310 15:25:01.961867 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 10 15:25:01 crc kubenswrapper[4795]: E0310 15:25:01.962012 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m5hsg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-czckz_openstack(2051a93f-4a2a-4d8a-b7c3-f4d24791dc00): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:25:01 crc kubenswrapper[4795]: E0310 15:25:01.963111 4795 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-czckz" podUID="2051a93f-4a2a-4d8a-b7c3-f4d24791dc00" Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.172027 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x9p5v"] Mar 10 15:25:02 crc kubenswrapper[4795]: E0310 15:25:02.235212 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" podUID="f54b4a29-0fff-4ce9-abcb-a14977f1d101" Mar 10 15:25:02 crc kubenswrapper[4795]: E0310 15:25:02.235616 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-czckz" podUID="2051a93f-4a2a-4d8a-b7c3-f4d24791dc00" Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.478069 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.555045 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.591292 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.714422 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-sblqq"] Mar 10 15:25:02 crc kubenswrapper[4795]: W0310 15:25:02.737208 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd58ad85e_6f98_4ba8_97b9_656dda7a5b93.slice/crio-447eb8bc1f8e39bb6a8925f17029d55157b602015b1d85cb23c91a69470195ee WatchSource:0}: Error finding container 447eb8bc1f8e39bb6a8925f17029d55157b602015b1d85cb23c91a69470195ee: Status 404 returned error can't find the container with id 447eb8bc1f8e39bb6a8925f17029d55157b602015b1d85cb23c91a69470195ee Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.742990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5j6m\" (UniqueName: \"kubernetes.io/projected/9bdd186c-fb5a-43f3-967e-fb0123a920b2-kube-api-access-q5j6m\") pod \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\" (UID: \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.743054 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-config\") pod \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\" (UID: \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.743208 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-dns-svc\") pod \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\" (UID: \"9bdd186c-fb5a-43f3-967e-fb0123a920b2\") " Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.744492 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9bdd186c-fb5a-43f3-967e-fb0123a920b2" (UID: "9bdd186c-fb5a-43f3-967e-fb0123a920b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.745529 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-config" (OuterVolumeSpecName: "config") pod "9bdd186c-fb5a-43f3-967e-fb0123a920b2" (UID: "9bdd186c-fb5a-43f3-967e-fb0123a920b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.748367 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bdd186c-fb5a-43f3-967e-fb0123a920b2-kube-api-access-q5j6m" (OuterVolumeSpecName: "kube-api-access-q5j6m") pod "9bdd186c-fb5a-43f3-967e-fb0123a920b2" (UID: "9bdd186c-fb5a-43f3-967e-fb0123a920b2"). InnerVolumeSpecName "kube-api-access-q5j6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.845000 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.845030 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5j6m\" (UniqueName: \"kubernetes.io/projected/9bdd186c-fb5a-43f3-967e-fb0123a920b2-kube-api-access-q5j6m\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:02 crc kubenswrapper[4795]: I0310 15:25:02.845041 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bdd186c-fb5a-43f3-967e-fb0123a920b2-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:03 crc kubenswrapper[4795]: I0310 15:25:03.240467 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d","Type":"ContainerStarted","Data":"46934d312ffcd073623b8fde1a9adaab85c2af31a5e4e67c21fb8ffd45a8dce6"} Mar 10 15:25:03 crc kubenswrapper[4795]: I0310 15:25:03.242443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x9p5v" event={"ID":"4823094a-6e7f-49da-9aa5-7d67b893896c","Type":"ContainerStarted","Data":"94006b01a32353e291007cc8ffd99fce279bc6f39c90e483a6e510106afa8d72"} Mar 10 15:25:03 crc kubenswrapper[4795]: I0310 15:25:03.244154 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" event={"ID":"9bdd186c-fb5a-43f3-967e-fb0123a920b2","Type":"ContainerDied","Data":"d90a55c9d9bb788c46723a692bde37d79bdc83e5916b8d83b7ee5fa4fde0eaae"} Mar 10 15:25:03 crc kubenswrapper[4795]: I0310 15:25:03.244195 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-lgtgt" Mar 10 15:25:03 crc kubenswrapper[4795]: I0310 15:25:03.257201 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8572cc94-1e6e-406c-b57b-56167baa0a87","Type":"ContainerStarted","Data":"befb83298ce046315c794b00f88c55f865b46b2496c67f7dc13f2d517a3d7eec"} Mar 10 15:25:03 crc kubenswrapper[4795]: I0310 15:25:03.269588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sblqq" event={"ID":"70f8a962-ce88-4e90-91b3-5272104b9d18","Type":"ContainerStarted","Data":"5d1be8152e60ed069be4de90aef69d63928a1c23734048b2e0ea80c14537344c"} Mar 10 15:25:03 crc kubenswrapper[4795]: I0310 15:25:03.271374 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d58ad85e-6f98-4ba8-97b9-656dda7a5b93","Type":"ContainerStarted","Data":"447eb8bc1f8e39bb6a8925f17029d55157b602015b1d85cb23c91a69470195ee"} Mar 10 15:25:03 crc kubenswrapper[4795]: I0310 15:25:03.326013 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lgtgt"] Mar 10 15:25:03 crc kubenswrapper[4795]: I0310 15:25:03.334333 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-lgtgt"] Mar 10 15:25:03 crc kubenswrapper[4795]: I0310 15:25:03.490175 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bdd186c-fb5a-43f3-967e-fb0123a920b2" path="/var/lib/kubelet/pods/9bdd186c-fb5a-43f3-967e-fb0123a920b2/volumes" Mar 10 15:25:04 crc kubenswrapper[4795]: I0310 15:25:04.283880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c1ccf1a8-3778-482d-b6b5-303de43c6a7e","Type":"ContainerStarted","Data":"e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046"} Mar 10 15:25:04 crc kubenswrapper[4795]: I0310 15:25:04.286180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/memcached-0" event={"ID":"f3f06c2a-0098-46d6-96e5-6cbe9caf24ef","Type":"ContainerStarted","Data":"aaffaf05bd8d52fc286b79573796b6eea1ef223a681b790bb1fd9b03d7c1849f"} Mar 10 15:25:04 crc kubenswrapper[4795]: I0310 15:25:04.286316 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 10 15:25:04 crc kubenswrapper[4795]: I0310 15:25:04.289452 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b8e5711d-12e4-458f-a944-6b37aca4afa3","Type":"ContainerStarted","Data":"05c97856e61f8a2281ea4a500551f6e4db70686b3178744a82ffe84ad336652d"} Mar 10 15:25:04 crc kubenswrapper[4795]: I0310 15:25:04.365360 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=3.459429699 podStartE2EDuration="26.365340025s" podCreationTimestamp="2026-03-10 15:24:38 +0000 UTC" firstStartedPulling="2026-03-10 15:24:39.518966328 +0000 UTC m=+1112.684707226" lastFinishedPulling="2026-03-10 15:25:02.424876644 +0000 UTC m=+1135.590617552" observedRunningTime="2026-03-10 15:25:04.356044969 +0000 UTC m=+1137.521785867" watchObservedRunningTime="2026-03-10 15:25:04.365340025 +0000 UTC m=+1137.531080923" Mar 10 15:25:07 crc kubenswrapper[4795]: I0310 15:25:07.310934 4795 generic.go:334] "Generic (PLEG): container finished" podID="b8e5711d-12e4-458f-a944-6b37aca4afa3" containerID="05c97856e61f8a2281ea4a500551f6e4db70686b3178744a82ffe84ad336652d" exitCode=0 Mar 10 15:25:07 crc kubenswrapper[4795]: I0310 15:25:07.311413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b8e5711d-12e4-458f-a944-6b37aca4afa3","Type":"ContainerDied","Data":"05c97856e61f8a2281ea4a500551f6e4db70686b3178744a82ffe84ad336652d"} Mar 10 15:25:07 crc kubenswrapper[4795]: I0310 15:25:07.315653 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"8572cc94-1e6e-406c-b57b-56167baa0a87","Type":"ContainerStarted","Data":"a0520f7a2c97be19d7ef70c7aaf3f6b6982d9be062f8e4f33a0cea31de3b01e5"} Mar 10 15:25:07 crc kubenswrapper[4795]: I0310 15:25:07.317717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sblqq" event={"ID":"70f8a962-ce88-4e90-91b3-5272104b9d18","Type":"ContainerStarted","Data":"df69db92fd618ba4ef4fd253c8c2284c8b2efdad131415b8f3c612326fdca7c4"} Mar 10 15:25:07 crc kubenswrapper[4795]: I0310 15:25:07.318876 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d58ad85e-6f98-4ba8-97b9-656dda7a5b93","Type":"ContainerStarted","Data":"6df6d0c860144ac45d75e79f5c18fa59192891338c673d66046b874ead33fdff"} Mar 10 15:25:07 crc kubenswrapper[4795]: I0310 15:25:07.322266 4795 generic.go:334] "Generic (PLEG): container finished" podID="cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d" containerID="46934d312ffcd073623b8fde1a9adaab85c2af31a5e4e67c21fb8ffd45a8dce6" exitCode=0 Mar 10 15:25:07 crc kubenswrapper[4795]: I0310 15:25:07.322381 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d","Type":"ContainerDied","Data":"46934d312ffcd073623b8fde1a9adaab85c2af31a5e4e67c21fb8ffd45a8dce6"} Mar 10 15:25:07 crc kubenswrapper[4795]: I0310 15:25:07.334017 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x9p5v" event={"ID":"4823094a-6e7f-49da-9aa5-7d67b893896c","Type":"ContainerStarted","Data":"d39e7ecb665d41a4920b9e8f77ed4004372ecdcd492fe61d94008e3a3499222c"} Mar 10 15:25:07 crc kubenswrapper[4795]: I0310 15:25:07.334748 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-x9p5v" Mar 10 15:25:07 crc kubenswrapper[4795]: I0310 15:25:07.366841 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-x9p5v" 
podStartSLOduration=20.091012145 podStartE2EDuration="24.36682757s" podCreationTimestamp="2026-03-10 15:24:43 +0000 UTC" firstStartedPulling="2026-03-10 15:25:02.437221946 +0000 UTC m=+1135.602962844" lastFinishedPulling="2026-03-10 15:25:06.713037351 +0000 UTC m=+1139.878778269" observedRunningTime="2026-03-10 15:25:07.364563756 +0000 UTC m=+1140.530304654" watchObservedRunningTime="2026-03-10 15:25:07.36682757 +0000 UTC m=+1140.532568468" Mar 10 15:25:08 crc kubenswrapper[4795]: I0310 15:25:08.345564 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d","Type":"ContainerStarted","Data":"33e285eb5b5291a7c6a4be7c1bbceb874d18623cef5eacd4a969067807a5e955"} Mar 10 15:25:08 crc kubenswrapper[4795]: I0310 15:25:08.351936 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"b8e5711d-12e4-458f-a944-6b37aca4afa3","Type":"ContainerStarted","Data":"2f51cbd5234fdb06b975d221e9ba994da876fb9cc7d4bb3cf9b940356dfa6347"} Mar 10 15:25:08 crc kubenswrapper[4795]: I0310 15:25:08.354740 4795 generic.go:334] "Generic (PLEG): container finished" podID="70f8a962-ce88-4e90-91b3-5272104b9d18" containerID="df69db92fd618ba4ef4fd253c8c2284c8b2efdad131415b8f3c612326fdca7c4" exitCode=0 Mar 10 15:25:08 crc kubenswrapper[4795]: I0310 15:25:08.354821 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sblqq" event={"ID":"70f8a962-ce88-4e90-91b3-5272104b9d18","Type":"ContainerDied","Data":"df69db92fd618ba4ef4fd253c8c2284c8b2efdad131415b8f3c612326fdca7c4"} Mar 10 15:25:08 crc kubenswrapper[4795]: I0310 15:25:08.366635 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.458252441 podStartE2EDuration="31.366612415s" podCreationTimestamp="2026-03-10 15:24:37 +0000 UTC" firstStartedPulling="2026-03-10 15:24:39.038929024 +0000 UTC 
m=+1112.204669922" lastFinishedPulling="2026-03-10 15:25:01.947288998 +0000 UTC m=+1135.113029896" observedRunningTime="2026-03-10 15:25:08.363644351 +0000 UTC m=+1141.529385269" watchObservedRunningTime="2026-03-10 15:25:08.366612415 +0000 UTC m=+1141.532353313" Mar 10 15:25:08 crc kubenswrapper[4795]: I0310 15:25:08.411976 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=9.525440165 podStartE2EDuration="33.411952901s" podCreationTimestamp="2026-03-10 15:24:35 +0000 UTC" firstStartedPulling="2026-03-10 15:24:38.650821101 +0000 UTC m=+1111.816561999" lastFinishedPulling="2026-03-10 15:25:02.537333837 +0000 UTC m=+1135.703074735" observedRunningTime="2026-03-10 15:25:08.407979317 +0000 UTC m=+1141.573720215" watchObservedRunningTime="2026-03-10 15:25:08.411952901 +0000 UTC m=+1141.577693799" Mar 10 15:25:08 crc kubenswrapper[4795]: I0310 15:25:08.467975 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:08 crc kubenswrapper[4795]: I0310 15:25:08.468030 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:08 crc kubenswrapper[4795]: I0310 15:25:08.680196 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 10 15:25:09 crc kubenswrapper[4795]: I0310 15:25:09.367022 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sblqq" event={"ID":"70f8a962-ce88-4e90-91b3-5272104b9d18","Type":"ContainerStarted","Data":"96390f7b8e4ffcee63e0494951c7fca4838245e6bd89e8a3442015f1a6e1a91c"} Mar 10 15:25:10 crc kubenswrapper[4795]: I0310 15:25:10.972844 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-czckz"] Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.028119 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-7cb5889db5-84fmc"] Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.029462 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.040741 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-84fmc"] Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.135961 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt4wk\" (UniqueName: \"kubernetes.io/projected/e3270be3-9263-4d42-bcfc-f7cfbed584e1-kube-api-access-wt4wk\") pod \"dnsmasq-dns-7cb5889db5-84fmc\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.136087 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-config\") pod \"dnsmasq-dns-7cb5889db5-84fmc\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.136107 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-84fmc\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.237381 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-config\") pod \"dnsmasq-dns-7cb5889db5-84fmc\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:11 crc 
kubenswrapper[4795]: I0310 15:25:11.237420 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-84fmc\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.237478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt4wk\" (UniqueName: \"kubernetes.io/projected/e3270be3-9263-4d42-bcfc-f7cfbed584e1-kube-api-access-wt4wk\") pod \"dnsmasq-dns-7cb5889db5-84fmc\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.238617 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-config\") pod \"dnsmasq-dns-7cb5889db5-84fmc\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.238793 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-84fmc\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.263056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt4wk\" (UniqueName: \"kubernetes.io/projected/e3270be3-9263-4d42-bcfc-f7cfbed584e1-kube-api-access-wt4wk\") pod \"dnsmasq-dns-7cb5889db5-84fmc\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.297025 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.338770 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-dns-svc\") pod \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.338844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-config\") pod \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.338874 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5hsg\" (UniqueName: \"kubernetes.io/projected/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-kube-api-access-m5hsg\") pod \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\" (UID: \"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00\") " Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.339629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2051a93f-4a2a-4d8a-b7c3-f4d24791dc00" (UID: "2051a93f-4a2a-4d8a-b7c3-f4d24791dc00"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.339657 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-config" (OuterVolumeSpecName: "config") pod "2051a93f-4a2a-4d8a-b7c3-f4d24791dc00" (UID: "2051a93f-4a2a-4d8a-b7c3-f4d24791dc00"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.344694 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-kube-api-access-m5hsg" (OuterVolumeSpecName: "kube-api-access-m5hsg") pod "2051a93f-4a2a-4d8a-b7c3-f4d24791dc00" (UID: "2051a93f-4a2a-4d8a-b7c3-f4d24791dc00"). InnerVolumeSpecName "kube-api-access-m5hsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.356488 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.383285 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9095d427-1630-4118-881c-eca71ebf01dc","Type":"ContainerStarted","Data":"ab8f1913a85a8307639a03ea12df17dcb12f506626cc4c85c89f9ee2e7880154"} Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.384137 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.388359 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-czckz" event={"ID":"2051a93f-4a2a-4d8a-b7c3-f4d24791dc00","Type":"ContainerDied","Data":"64b17beacfc85a170d7bed2a36ace94ec146cc3377ff0ba7c209d3de31a5fb7e"} Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.388460 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-czckz" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.393304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"8572cc94-1e6e-406c-b57b-56167baa0a87","Type":"ContainerStarted","Data":"8c12824fe955f7fa7349cf5063b87fdc73f3271b7879e597fec8823472c678cb"} Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.400175 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.311777673 podStartE2EDuration="31.400154137s" podCreationTimestamp="2026-03-10 15:24:40 +0000 UTC" firstStartedPulling="2026-03-10 15:24:41.384010755 +0000 UTC m=+1114.549751653" lastFinishedPulling="2026-03-10 15:25:10.472387219 +0000 UTC m=+1143.638128117" observedRunningTime="2026-03-10 15:25:11.398366316 +0000 UTC m=+1144.564107214" watchObservedRunningTime="2026-03-10 15:25:11.400154137 +0000 UTC m=+1144.565895035" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.401955 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-sblqq" event={"ID":"70f8a962-ce88-4e90-91b3-5272104b9d18","Type":"ContainerStarted","Data":"43930b308274198fce86ef7e4c89460dc09c58ea5afa61008d4a89587b1a2c6d"} Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.402019 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.402116 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.404421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d58ad85e-6f98-4ba8-97b9-656dda7a5b93","Type":"ContainerStarted","Data":"f564afcf5695f8a46f99ddbef42f4567420b18484053150a9f7f3ce6b05c8ea1"} Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 
15:25:11.418587 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=20.440776199 podStartE2EDuration="28.418571653s" podCreationTimestamp="2026-03-10 15:24:43 +0000 UTC" firstStartedPulling="2026-03-10 15:25:02.578238085 +0000 UTC m=+1135.743978983" lastFinishedPulling="2026-03-10 15:25:10.556033539 +0000 UTC m=+1143.721774437" observedRunningTime="2026-03-10 15:25:11.415982279 +0000 UTC m=+1144.581723177" watchObservedRunningTime="2026-03-10 15:25:11.418571653 +0000 UTC m=+1144.584312551" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.442895 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.442939 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.442951 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5hsg\" (UniqueName: \"kubernetes.io/projected/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00-kube-api-access-m5hsg\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.455582 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.649861003 podStartE2EDuration="25.4555529s" podCreationTimestamp="2026-03-10 15:24:46 +0000 UTC" firstStartedPulling="2026-03-10 15:25:02.740569783 +0000 UTC m=+1135.906310681" lastFinishedPulling="2026-03-10 15:25:10.54626168 +0000 UTC m=+1143.712002578" observedRunningTime="2026-03-10 15:25:11.435085835 +0000 UTC m=+1144.600826733" watchObservedRunningTime="2026-03-10 15:25:11.4555529 +0000 UTC m=+1144.621293828" Mar 10 15:25:11 crc 
kubenswrapper[4795]: I0310 15:25:11.473688 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-sblqq" podStartSLOduration=24.50297772 podStartE2EDuration="28.473665587s" podCreationTimestamp="2026-03-10 15:24:43 +0000 UTC" firstStartedPulling="2026-03-10 15:25:02.738109793 +0000 UTC m=+1135.903850691" lastFinishedPulling="2026-03-10 15:25:06.70879764 +0000 UTC m=+1139.874538558" observedRunningTime="2026-03-10 15:25:11.464263469 +0000 UTC m=+1144.630004377" watchObservedRunningTime="2026-03-10 15:25:11.473665587 +0000 UTC m=+1144.639406485" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.517748 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-czckz"] Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.522407 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-czckz"] Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.874901 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-84fmc"] Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.908244 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:11 crc kubenswrapper[4795]: I0310 15:25:11.952788 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.092491 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.099424 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.101255 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.102025 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.102267 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-dlrw2" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.102488 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.116977 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.254443 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612389f7-00cb-49cc-9daf-a5e451d6312f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.254624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.254790 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/612389f7-00cb-49cc-9daf-a5e451d6312f-lock\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 
15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.254879 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96dfv\" (UniqueName: \"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-kube-api-access-96dfv\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.254931 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/612389f7-00cb-49cc-9daf-a5e451d6312f-cache\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.255093 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.356965 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/612389f7-00cb-49cc-9daf-a5e451d6312f-lock\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.357044 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96dfv\" (UniqueName: \"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-kube-api-access-96dfv\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.357091 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: 
\"kubernetes.io/empty-dir/612389f7-00cb-49cc-9daf-a5e451d6312f-cache\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.357149 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.357180 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612389f7-00cb-49cc-9daf-a5e451d6312f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.357231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: E0310 15:25:12.357369 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 15:25:12 crc kubenswrapper[4795]: E0310 15:25:12.357386 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 15:25:12 crc kubenswrapper[4795]: E0310 15:25:12.357427 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift podName:612389f7-00cb-49cc-9daf-a5e451d6312f nodeName:}" failed. 
No retries permitted until 2026-03-10 15:25:12.857409587 +0000 UTC m=+1146.023150495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift") pod "swift-storage-0" (UID: "612389f7-00cb-49cc-9daf-a5e451d6312f") : configmap "swift-ring-files" not found Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.357542 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.357640 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/612389f7-00cb-49cc-9daf-a5e451d6312f-lock\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.357645 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/612389f7-00cb-49cc-9daf-a5e451d6312f-cache\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.373017 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612389f7-00cb-49cc-9daf-a5e451d6312f-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.373460 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96dfv\" (UniqueName: 
\"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-kube-api-access-96dfv\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.385424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.413941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" event={"ID":"e3270be3-9263-4d42-bcfc-f7cfbed584e1","Type":"ContainerStarted","Data":"d8d9c4cfa2240e13485ffe4255df1e09c9d8bb4bcc05601bfcea344c6620e6b2"} Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.414211 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.454690 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.481104 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.521934 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.591952 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-nwz6d"] Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.595140 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.596925 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.597355 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.597370 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.610456 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nwz6d"] Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.693598 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-krr6z"] Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.717784 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-kltrc"] Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.719466 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.728232 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.728436 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-kltrc"] Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.761824 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khmj9\" (UniqueName: \"kubernetes.io/projected/c3379cef-da13-42d0-80b5-1600bbde9f95-kube-api-access-khmj9\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.761873 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-ring-data-devices\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.761919 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-swiftconf\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.761982 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c3379cef-da13-42d0-80b5-1600bbde9f95-etc-swift\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " 
pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.762006 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-scripts\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.762124 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-dispersionconf\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.762163 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-combined-ca-bundle\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.808443 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-j7t9w"] Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.809495 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.840712 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.851644 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-j7t9w"] Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.863194 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c3379cef-da13-42d0-80b5-1600bbde9f95-etc-swift\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.863246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-config\") pod \"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.863270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-scripts\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.863316 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:12 crc kubenswrapper[4795]: 
I0310 15:25:12.863346 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.863374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-dns-svc\") pod \"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.863421 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-dispersionconf\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.863466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-combined-ca-bundle\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.863504 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khmj9\" (UniqueName: \"kubernetes.io/projected/c3379cef-da13-42d0-80b5-1600bbde9f95-kube-api-access-khmj9\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.863524 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-ring-data-devices\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.863568 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-swiftconf\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.863600 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr88w\" (UniqueName: \"kubernetes.io/projected/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-kube-api-access-dr88w\") pod \"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.864631 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c3379cef-da13-42d0-80b5-1600bbde9f95-etc-swift\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: E0310 15:25:12.865457 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 15:25:12 crc kubenswrapper[4795]: E0310 15:25:12.865476 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 15:25:12 crc kubenswrapper[4795]: E0310 15:25:12.865563 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift podName:612389f7-00cb-49cc-9daf-a5e451d6312f nodeName:}" failed. No retries permitted until 2026-03-10 15:25:13.865544735 +0000 UTC m=+1147.031285693 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift") pod "swift-storage-0" (UID: "612389f7-00cb-49cc-9daf-a5e451d6312f") : configmap "swift-ring-files" not found Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.873134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-ring-data-devices\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.873235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-scripts\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.879383 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-combined-ca-bundle\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.881321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-dispersionconf\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " 
pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.892661 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-swiftconf\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.896863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khmj9\" (UniqueName: \"kubernetes.io/projected/c3379cef-da13-42d0-80b5-1600bbde9f95-kube-api-access-khmj9\") pod \"swift-ring-rebalance-nwz6d\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.918684 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.965776 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.968002 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-ovsdbserver-nb\") pod \"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.969995 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-dns-svc\") pod 
\"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.970476 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfac213-1d19-4c6f-b88b-e7513792f2a1-combined-ca-bundle\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.970588 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfac213-1d19-4c6f-b88b-e7513792f2a1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.970603 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-dns-svc\") pod \"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.970719 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fbfac213-1d19-4c6f-b88b-e7513792f2a1-ovs-rundir\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.970759 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr88w\" (UniqueName: 
\"kubernetes.io/projected/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-kube-api-access-dr88w\") pod \"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.970799 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbfac213-1d19-4c6f-b88b-e7513792f2a1-config\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.970848 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-config\") pod \"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.970889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqrqg\" (UniqueName: \"kubernetes.io/projected/fbfac213-1d19-4c6f-b88b-e7513792f2a1-kube-api-access-qqrqg\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.970930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fbfac213-1d19-4c6f-b88b-e7513792f2a1-ovn-rundir\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:12 crc kubenswrapper[4795]: I0310 15:25:12.972150 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-config\") pod \"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:12.995397 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-84fmc"] Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.016752 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr88w\" (UniqueName: \"kubernetes.io/projected/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-kube-api-access-dr88w\") pod \"dnsmasq-dns-57d65f699f-kltrc\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.035241 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-p4wdl"] Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.036617 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.039470 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.042733 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.067457 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-p4wdl"] Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.073928 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqrqg\" (UniqueName: \"kubernetes.io/projected/fbfac213-1d19-4c6f-b88b-e7513792f2a1-kube-api-access-qqrqg\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.073975 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fbfac213-1d19-4c6f-b88b-e7513792f2a1-ovn-rundir\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.074116 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfac213-1d19-4c6f-b88b-e7513792f2a1-combined-ca-bundle\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.074165 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfac213-1d19-4c6f-b88b-e7513792f2a1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc 
kubenswrapper[4795]: I0310 15:25:13.074213 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fbfac213-1d19-4c6f-b88b-e7513792f2a1-ovs-rundir\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.074239 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbfac213-1d19-4c6f-b88b-e7513792f2a1-config\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.074900 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/fbfac213-1d19-4c6f-b88b-e7513792f2a1-ovn-rundir\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.075516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fbfac213-1d19-4c6f-b88b-e7513792f2a1-config\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.075604 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/fbfac213-1d19-4c6f-b88b-e7513792f2a1-ovs-rundir\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.087772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbfac213-1d19-4c6f-b88b-e7513792f2a1-combined-ca-bundle\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.088414 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbfac213-1d19-4c6f-b88b-e7513792f2a1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.125510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqrqg\" (UniqueName: \"kubernetes.io/projected/fbfac213-1d19-4c6f-b88b-e7513792f2a1-kube-api-access-qqrqg\") pod \"ovn-controller-metrics-j7t9w\" (UID: \"fbfac213-1d19-4c6f-b88b-e7513792f2a1\") " pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.133277 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.177892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.177976 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.178008 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb59l\" (UniqueName: \"kubernetes.io/projected/e2b540d3-b27e-4bf6-922c-89fef3c66955-kube-api-access-cb59l\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.178058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-config\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.178133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: 
\"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.263402 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-j7t9w" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.279557 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-dns-svc\") pod \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.279691 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqvk2\" (UniqueName: \"kubernetes.io/projected/f54b4a29-0fff-4ce9-abcb-a14977f1d101-kube-api-access-vqvk2\") pod \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.279768 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-config\") pod \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\" (UID: \"f54b4a29-0fff-4ce9-abcb-a14977f1d101\") " Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.279996 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.280040 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb59l\" (UniqueName: \"kubernetes.io/projected/e2b540d3-b27e-4bf6-922c-89fef3c66955-kube-api-access-cb59l\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" 
(UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.280128 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-config\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.280207 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.280254 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.281192 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.282655 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 
crc kubenswrapper[4795]: I0310 15:25:13.283232 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f54b4a29-0fff-4ce9-abcb-a14977f1d101" (UID: "f54b4a29-0fff-4ce9-abcb-a14977f1d101"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.284833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-config\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.286250 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f54b4a29-0fff-4ce9-abcb-a14977f1d101-kube-api-access-vqvk2" (OuterVolumeSpecName: "kube-api-access-vqvk2") pod "f54b4a29-0fff-4ce9-abcb-a14977f1d101" (UID: "f54b4a29-0fff-4ce9-abcb-a14977f1d101"). InnerVolumeSpecName "kube-api-access-vqvk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.289056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.296218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-config" (OuterVolumeSpecName: "config") pod "f54b4a29-0fff-4ce9-abcb-a14977f1d101" (UID: "f54b4a29-0fff-4ce9-abcb-a14977f1d101"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.310831 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb59l\" (UniqueName: \"kubernetes.io/projected/e2b540d3-b27e-4bf6-922c-89fef3c66955-kube-api-access-cb59l\") pod \"dnsmasq-dns-b8fbc5445-p4wdl\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.381491 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.381528 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqvk2\" (UniqueName: \"kubernetes.io/projected/f54b4a29-0fff-4ce9-abcb-a14977f1d101-kube-api-access-vqvk2\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.381542 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f54b4a29-0fff-4ce9-abcb-a14977f1d101-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.426150 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" event={"ID":"f54b4a29-0fff-4ce9-abcb-a14977f1d101","Type":"ContainerDied","Data":"cdc3df8887c9725ff3cad17e97533e6f9955f1236cd59c36ebc492edbd8f09e0"} Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.427422 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.427844 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-krr6z" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.428962 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.516432 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2051a93f-4a2a-4d8a-b7c3-f4d24791dc00" path="/var/lib/kubelet/pods/2051a93f-4a2a-4d8a-b7c3-f4d24791dc00/volumes" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.517150 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.537056 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-krr6z"] Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.545146 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-krr6z"] Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.584528 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-nwz6d"] Mar 10 15:25:13 crc kubenswrapper[4795]: W0310 15:25:13.599470 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3379cef_da13_42d0_80b5_1600bbde9f95.slice/crio-2aa87d1e9c370aead7b73da943725d565b31cf2d9f004f5b6fee027821b82156 WatchSource:0}: Error finding container 2aa87d1e9c370aead7b73da943725d565b31cf2d9f004f5b6fee027821b82156: Status 404 returned error can't find the container with id 2aa87d1e9c370aead7b73da943725d565b31cf2d9f004f5b6fee027821b82156 Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.681642 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-kltrc"] Mar 10 15:25:13 crc kubenswrapper[4795]: W0310 15:25:13.692230 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63ab1e80_e9e2_4dad_a77f_66fd09e56dc1.slice/crio-1aa199b7788eb3b772954c527c5a82c19423eec3d16fd37e4dbe9d1a3cd95fa8 WatchSource:0}: Error finding container 1aa199b7788eb3b772954c527c5a82c19423eec3d16fd37e4dbe9d1a3cd95fa8: Status 404 returned error can't find the container with id 1aa199b7788eb3b772954c527c5a82c19423eec3d16fd37e4dbe9d1a3cd95fa8 Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.770174 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.771570 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.774158 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.774339 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.774451 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-gmfkv" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.774501 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.775308 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 15:25:13 crc kubenswrapper[4795]: W0310 15:25:13.806023 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbfac213_1d19_4c6f_b88b_e7513792f2a1.slice/crio-a0a7e87cf540e8bd91d4009362300ac0c29be0345c30cf527b3deda64928bbb1 WatchSource:0}: Error finding container a0a7e87cf540e8bd91d4009362300ac0c29be0345c30cf527b3deda64928bbb1: Status 404 returned error can't 
find the container with id a0a7e87cf540e8bd91d4009362300ac0c29be0345c30cf527b3deda64928bbb1 Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.814175 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-j7t9w"] Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.890114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.890716 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.890751 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-scripts\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.890800 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.890864 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.890947 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.890991 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-config\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.891040 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z2kh\" (UniqueName: \"kubernetes.io/projected/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-kube-api-access-7z2kh\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: E0310 15:25:13.891255 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 15:25:13 crc kubenswrapper[4795]: E0310 15:25:13.891293 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 15:25:13 crc kubenswrapper[4795]: E0310 15:25:13.891359 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift podName:612389f7-00cb-49cc-9daf-a5e451d6312f nodeName:}" failed. 
No retries permitted until 2026-03-10 15:25:15.891330953 +0000 UTC m=+1149.057072021 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift") pod "swift-storage-0" (UID: "612389f7-00cb-49cc-9daf-a5e451d6312f") : configmap "swift-ring-files" not found Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.945160 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-p4wdl"] Mar 10 15:25:13 crc kubenswrapper[4795]: W0310 15:25:13.952432 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2b540d3_b27e_4bf6_922c_89fef3c66955.slice/crio-47fb011077be883cc729d7e19cb532aecab6134e5708bda4cc43d8e41855fbdd WatchSource:0}: Error finding container 47fb011077be883cc729d7e19cb532aecab6134e5708bda4cc43d8e41855fbdd: Status 404 returned error can't find the container with id 47fb011077be883cc729d7e19cb532aecab6134e5708bda4cc43d8e41855fbdd Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.993206 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.993275 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.993338 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-config\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.993366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z2kh\" (UniqueName: \"kubernetes.io/projected/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-kube-api-access-7z2kh\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.993434 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.993464 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.993483 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-scripts\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.994182 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc 
kubenswrapper[4795]: I0310 15:25:13.994840 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-scripts\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.995955 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-config\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:13 crc kubenswrapper[4795]: I0310 15:25:13.998321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:14 crc kubenswrapper[4795]: I0310 15:25:13.999603 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:14 crc kubenswrapper[4795]: I0310 15:25:13.999713 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:14 crc kubenswrapper[4795]: I0310 15:25:14.010437 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z2kh\" (UniqueName: \"kubernetes.io/projected/f8be8f3c-b4c1-41dc-99bf-3950c21ce504-kube-api-access-7z2kh\") pod 
\"ovn-northd-0\" (UID: \"f8be8f3c-b4c1-41dc-99bf-3950c21ce504\") " pod="openstack/ovn-northd-0" Mar 10 15:25:14 crc kubenswrapper[4795]: I0310 15:25:14.088655 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 10 15:25:14 crc kubenswrapper[4795]: I0310 15:25:14.435493 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" event={"ID":"e2b540d3-b27e-4bf6-922c-89fef3c66955","Type":"ContainerStarted","Data":"47fb011077be883cc729d7e19cb532aecab6134e5708bda4cc43d8e41855fbdd"} Mar 10 15:25:14 crc kubenswrapper[4795]: I0310 15:25:14.436645 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" event={"ID":"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1","Type":"ContainerStarted","Data":"1aa199b7788eb3b772954c527c5a82c19423eec3d16fd37e4dbe9d1a3cd95fa8"} Mar 10 15:25:14 crc kubenswrapper[4795]: I0310 15:25:14.437774 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-j7t9w" event={"ID":"fbfac213-1d19-4c6f-b88b-e7513792f2a1","Type":"ContainerStarted","Data":"a0a7e87cf540e8bd91d4009362300ac0c29be0345c30cf527b3deda64928bbb1"} Mar 10 15:25:14 crc kubenswrapper[4795]: I0310 15:25:14.447395 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nwz6d" event={"ID":"c3379cef-da13-42d0-80b5-1600bbde9f95","Type":"ContainerStarted","Data":"2aa87d1e9c370aead7b73da943725d565b31cf2d9f004f5b6fee027821b82156"} Mar 10 15:25:14 crc kubenswrapper[4795]: I0310 15:25:14.530986 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 10 15:25:15 crc kubenswrapper[4795]: I0310 15:25:15.455040 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f8be8f3c-b4c1-41dc-99bf-3950c21ce504","Type":"ContainerStarted","Data":"782131ec3988ca8cb2ecad456b0544492e26d6a98b219210b0619616b2d2e15f"} Mar 10 15:25:15 crc 
kubenswrapper[4795]: I0310 15:25:15.485901 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f54b4a29-0fff-4ce9-abcb-a14977f1d101" path="/var/lib/kubelet/pods/f54b4a29-0fff-4ce9-abcb-a14977f1d101/volumes" Mar 10 15:25:15 crc kubenswrapper[4795]: I0310 15:25:15.925910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:15 crc kubenswrapper[4795]: E0310 15:25:15.926096 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 15:25:15 crc kubenswrapper[4795]: E0310 15:25:15.926325 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 15:25:15 crc kubenswrapper[4795]: E0310 15:25:15.926375 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift podName:612389f7-00cb-49cc-9daf-a5e451d6312f nodeName:}" failed. No retries permitted until 2026-03-10 15:25:19.926359786 +0000 UTC m=+1153.092100684 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift") pod "swift-storage-0" (UID: "612389f7-00cb-49cc-9daf-a5e451d6312f") : configmap "swift-ring-files" not found Mar 10 15:25:17 crc kubenswrapper[4795]: I0310 15:25:17.256835 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 10 15:25:17 crc kubenswrapper[4795]: I0310 15:25:17.256909 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 10 15:25:17 crc kubenswrapper[4795]: I0310 15:25:17.338812 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 10 15:25:17 crc kubenswrapper[4795]: I0310 15:25:17.579037 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 10 15:25:18 crc kubenswrapper[4795]: I0310 15:25:18.481298 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-j7t9w" event={"ID":"fbfac213-1d19-4c6f-b88b-e7513792f2a1","Type":"ContainerStarted","Data":"83a2d743652c08b902678e86182a9658b659bd57d35872f84b5aaf73e5c65c3f"} Mar 10 15:25:18 crc kubenswrapper[4795]: I0310 15:25:18.486350 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9b0778eb-949e-46e9-bc72-cc42ec440aa2","Type":"ContainerStarted","Data":"2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85"} Mar 10 15:25:18 crc kubenswrapper[4795]: I0310 15:25:18.520685 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-j7t9w" podStartSLOduration=6.520658178 podStartE2EDuration="6.520658178s" podCreationTimestamp="2026-03-10 15:25:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
15:25:18.499629287 +0000 UTC m=+1151.665370185" watchObservedRunningTime="2026-03-10 15:25:18.520658178 +0000 UTC m=+1151.686399096" Mar 10 15:25:18 crc kubenswrapper[4795]: I0310 15:25:18.639693 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:18 crc kubenswrapper[4795]: I0310 15:25:18.784309 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 10 15:25:18 crc kubenswrapper[4795]: I0310 15:25:18.958132 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-68gqp"] Mar 10 15:25:18 crc kubenswrapper[4795]: I0310 15:25:18.959372 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-68gqp" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:18.996269 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-68gqp"] Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.006284 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6b7b-account-create-update-2mtzv"] Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.007348 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6b7b-account-create-update-2mtzv" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.010831 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.041250 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6b7b-account-create-update-2mtzv"] Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.077635 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t9z2\" (UniqueName: \"kubernetes.io/projected/4f3d684b-c185-4a2b-a739-a40d239bc661-kube-api-access-9t9z2\") pod \"glance-db-create-68gqp\" (UID: \"4f3d684b-c185-4a2b-a739-a40d239bc661\") " pod="openstack/glance-db-create-68gqp" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.077930 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f3d684b-c185-4a2b-a739-a40d239bc661-operator-scripts\") pod \"glance-db-create-68gqp\" (UID: \"4f3d684b-c185-4a2b-a739-a40d239bc661\") " pod="openstack/glance-db-create-68gqp" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.179516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f3d684b-c185-4a2b-a739-a40d239bc661-operator-scripts\") pod \"glance-db-create-68gqp\" (UID: \"4f3d684b-c185-4a2b-a739-a40d239bc661\") " pod="openstack/glance-db-create-68gqp" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.179665 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc47465-6390-4daf-981d-477efb47c502-operator-scripts\") pod \"glance-6b7b-account-create-update-2mtzv\" (UID: \"dcc47465-6390-4daf-981d-477efb47c502\") " 
pod="openstack/glance-6b7b-account-create-update-2mtzv" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.179733 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t9z2\" (UniqueName: \"kubernetes.io/projected/4f3d684b-c185-4a2b-a739-a40d239bc661-kube-api-access-9t9z2\") pod \"glance-db-create-68gqp\" (UID: \"4f3d684b-c185-4a2b-a739-a40d239bc661\") " pod="openstack/glance-db-create-68gqp" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.179763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbjkr\" (UniqueName: \"kubernetes.io/projected/dcc47465-6390-4daf-981d-477efb47c502-kube-api-access-tbjkr\") pod \"glance-6b7b-account-create-update-2mtzv\" (UID: \"dcc47465-6390-4daf-981d-477efb47c502\") " pod="openstack/glance-6b7b-account-create-update-2mtzv" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.180751 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f3d684b-c185-4a2b-a739-a40d239bc661-operator-scripts\") pod \"glance-db-create-68gqp\" (UID: \"4f3d684b-c185-4a2b-a739-a40d239bc661\") " pod="openstack/glance-db-create-68gqp" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.196383 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t9z2\" (UniqueName: \"kubernetes.io/projected/4f3d684b-c185-4a2b-a739-a40d239bc661-kube-api-access-9t9z2\") pod \"glance-db-create-68gqp\" (UID: \"4f3d684b-c185-4a2b-a739-a40d239bc661\") " pod="openstack/glance-db-create-68gqp" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.280752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc47465-6390-4daf-981d-477efb47c502-operator-scripts\") pod \"glance-6b7b-account-create-update-2mtzv\" (UID: 
\"dcc47465-6390-4daf-981d-477efb47c502\") " pod="openstack/glance-6b7b-account-create-update-2mtzv" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.281194 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbjkr\" (UniqueName: \"kubernetes.io/projected/dcc47465-6390-4daf-981d-477efb47c502-kube-api-access-tbjkr\") pod \"glance-6b7b-account-create-update-2mtzv\" (UID: \"dcc47465-6390-4daf-981d-477efb47c502\") " pod="openstack/glance-6b7b-account-create-update-2mtzv" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.281522 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc47465-6390-4daf-981d-477efb47c502-operator-scripts\") pod \"glance-6b7b-account-create-update-2mtzv\" (UID: \"dcc47465-6390-4daf-981d-477efb47c502\") " pod="openstack/glance-6b7b-account-create-update-2mtzv" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.297225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbjkr\" (UniqueName: \"kubernetes.io/projected/dcc47465-6390-4daf-981d-477efb47c502-kube-api-access-tbjkr\") pod \"glance-6b7b-account-create-update-2mtzv\" (UID: \"dcc47465-6390-4daf-981d-477efb47c502\") " pod="openstack/glance-6b7b-account-create-update-2mtzv" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.343357 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-68gqp" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.371421 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6b7b-account-create-update-2mtzv" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.513942 4795 generic.go:334] "Generic (PLEG): container finished" podID="e2b540d3-b27e-4bf6-922c-89fef3c66955" containerID="fe1b8ecbb879fe289a13b411e19fdefc6599d9ca3d770da7d54458ae0396b35e" exitCode=0 Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.514025 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" event={"ID":"e2b540d3-b27e-4bf6-922c-89fef3c66955","Type":"ContainerDied","Data":"fe1b8ecbb879fe289a13b411e19fdefc6599d9ca3d770da7d54458ae0396b35e"} Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.519276 4795 generic.go:334] "Generic (PLEG): container finished" podID="e3270be3-9263-4d42-bcfc-f7cfbed584e1" containerID="2d04dfd9e98074883701c9ac066a4b3b213c01b176db26cfdc1407e456cb60e8" exitCode=0 Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.519338 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" event={"ID":"e3270be3-9263-4d42-bcfc-f7cfbed584e1","Type":"ContainerDied","Data":"2d04dfd9e98074883701c9ac066a4b3b213c01b176db26cfdc1407e456cb60e8"} Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.522594 4795 generic.go:334] "Generic (PLEG): container finished" podID="63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" containerID="916dfe839844d94f0d3a2c0647e31a4d42449e98b6dbf5555b86971f42e2b5b8" exitCode=0 Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.525038 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" event={"ID":"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1","Type":"ContainerDied","Data":"916dfe839844d94f0d3a2c0647e31a4d42449e98b6dbf5555b86971f42e2b5b8"} Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.715660 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4nvx5"] Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 
15:25:19.716937 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4nvx5" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.727663 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4nvx5"] Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.799168 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-280d-account-create-update-l8cfw"] Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.800647 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-280d-account-create-update-l8cfw" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.802505 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.811483 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-280d-account-create-update-l8cfw"] Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.893647 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84099369-9f74-4a3a-baca-2a83bb833639-operator-scripts\") pod \"keystone-db-create-4nvx5\" (UID: \"84099369-9f74-4a3a-baca-2a83bb833639\") " pod="openstack/keystone-db-create-4nvx5" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.893739 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a52410d0-e4ef-4d3c-849f-be86e8e4333d-operator-scripts\") pod \"keystone-280d-account-create-update-l8cfw\" (UID: \"a52410d0-e4ef-4d3c-849f-be86e8e4333d\") " pod="openstack/keystone-280d-account-create-update-l8cfw" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.893765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-xrz9j\" (UniqueName: \"kubernetes.io/projected/a52410d0-e4ef-4d3c-849f-be86e8e4333d-kube-api-access-xrz9j\") pod \"keystone-280d-account-create-update-l8cfw\" (UID: \"a52410d0-e4ef-4d3c-849f-be86e8e4333d\") " pod="openstack/keystone-280d-account-create-update-l8cfw" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.893848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-424zn\" (UniqueName: \"kubernetes.io/projected/84099369-9f74-4a3a-baca-2a83bb833639-kube-api-access-424zn\") pod \"keystone-db-create-4nvx5\" (UID: \"84099369-9f74-4a3a-baca-2a83bb833639\") " pod="openstack/keystone-db-create-4nvx5" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.995282 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84099369-9f74-4a3a-baca-2a83bb833639-operator-scripts\") pod \"keystone-db-create-4nvx5\" (UID: \"84099369-9f74-4a3a-baca-2a83bb833639\") " pod="openstack/keystone-db-create-4nvx5" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.995373 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a52410d0-e4ef-4d3c-849f-be86e8e4333d-operator-scripts\") pod \"keystone-280d-account-create-update-l8cfw\" (UID: \"a52410d0-e4ef-4d3c-849f-be86e8e4333d\") " pod="openstack/keystone-280d-account-create-update-l8cfw" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.995406 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrz9j\" (UniqueName: \"kubernetes.io/projected/a52410d0-e4ef-4d3c-849f-be86e8e4333d-kube-api-access-xrz9j\") pod \"keystone-280d-account-create-update-l8cfw\" (UID: \"a52410d0-e4ef-4d3c-849f-be86e8e4333d\") " pod="openstack/keystone-280d-account-create-update-l8cfw" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 
15:25:19.995515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.995562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-424zn\" (UniqueName: \"kubernetes.io/projected/84099369-9f74-4a3a-baca-2a83bb833639-kube-api-access-424zn\") pod \"keystone-db-create-4nvx5\" (UID: \"84099369-9f74-4a3a-baca-2a83bb833639\") " pod="openstack/keystone-db-create-4nvx5" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.996533 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84099369-9f74-4a3a-baca-2a83bb833639-operator-scripts\") pod \"keystone-db-create-4nvx5\" (UID: \"84099369-9f74-4a3a-baca-2a83bb833639\") " pod="openstack/keystone-db-create-4nvx5" Mar 10 15:25:19 crc kubenswrapper[4795]: I0310 15:25:19.996670 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a52410d0-e4ef-4d3c-849f-be86e8e4333d-operator-scripts\") pod \"keystone-280d-account-create-update-l8cfw\" (UID: \"a52410d0-e4ef-4d3c-849f-be86e8e4333d\") " pod="openstack/keystone-280d-account-create-update-l8cfw" Mar 10 15:25:19 crc kubenswrapper[4795]: E0310 15:25:19.996691 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 15:25:19 crc kubenswrapper[4795]: E0310 15:25:19.996738 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 15:25:19 crc kubenswrapper[4795]: E0310 15:25:19.996798 4795 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift podName:612389f7-00cb-49cc-9daf-a5e451d6312f nodeName:}" failed. No retries permitted until 2026-03-10 15:25:27.996776173 +0000 UTC m=+1161.162517121 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift") pod "swift-storage-0" (UID: "612389f7-00cb-49cc-9daf-a5e451d6312f") : configmap "swift-ring-files" not found Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.002937 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-wrt7x"] Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.004369 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wrt7x" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.015104 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-751f-account-create-update-bngwt"] Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.016120 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-751f-account-create-update-bngwt" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.018396 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.024842 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wrt7x"] Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.028217 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-424zn\" (UniqueName: \"kubernetes.io/projected/84099369-9f74-4a3a-baca-2a83bb833639-kube-api-access-424zn\") pod \"keystone-db-create-4nvx5\" (UID: \"84099369-9f74-4a3a-baca-2a83bb833639\") " pod="openstack/keystone-db-create-4nvx5" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.028217 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrz9j\" (UniqueName: \"kubernetes.io/projected/a52410d0-e4ef-4d3c-849f-be86e8e4333d-kube-api-access-xrz9j\") pod \"keystone-280d-account-create-update-l8cfw\" (UID: \"a52410d0-e4ef-4d3c-849f-be86e8e4333d\") " pod="openstack/keystone-280d-account-create-update-l8cfw" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.034027 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-751f-account-create-update-bngwt"] Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.042581 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4nvx5" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.097204 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06792de7-f343-4a3f-ad0b-66577ab2aca6-operator-scripts\") pod \"placement-db-create-wrt7x\" (UID: \"06792de7-f343-4a3f-ad0b-66577ab2aca6\") " pod="openstack/placement-db-create-wrt7x" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.097245 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j94jj\" (UniqueName: \"kubernetes.io/projected/06792de7-f343-4a3f-ad0b-66577ab2aca6-kube-api-access-j94jj\") pod \"placement-db-create-wrt7x\" (UID: \"06792de7-f343-4a3f-ad0b-66577ab2aca6\") " pod="openstack/placement-db-create-wrt7x" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.152974 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-280d-account-create-update-l8cfw" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.198800 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e420f1-ef1a-4069-b328-caa602467dff-operator-scripts\") pod \"placement-751f-account-create-update-bngwt\" (UID: \"d5e420f1-ef1a-4069-b328-caa602467dff\") " pod="openstack/placement-751f-account-create-update-bngwt" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.199047 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06792de7-f343-4a3f-ad0b-66577ab2aca6-operator-scripts\") pod \"placement-db-create-wrt7x\" (UID: \"06792de7-f343-4a3f-ad0b-66577ab2aca6\") " pod="openstack/placement-db-create-wrt7x" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.199100 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j94jj\" (UniqueName: \"kubernetes.io/projected/06792de7-f343-4a3f-ad0b-66577ab2aca6-kube-api-access-j94jj\") pod \"placement-db-create-wrt7x\" (UID: \"06792de7-f343-4a3f-ad0b-66577ab2aca6\") " pod="openstack/placement-db-create-wrt7x" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.199238 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdwcc\" (UniqueName: \"kubernetes.io/projected/d5e420f1-ef1a-4069-b328-caa602467dff-kube-api-access-tdwcc\") pod \"placement-751f-account-create-update-bngwt\" (UID: \"d5e420f1-ef1a-4069-b328-caa602467dff\") " pod="openstack/placement-751f-account-create-update-bngwt" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.199965 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/06792de7-f343-4a3f-ad0b-66577ab2aca6-operator-scripts\") pod \"placement-db-create-wrt7x\" (UID: \"06792de7-f343-4a3f-ad0b-66577ab2aca6\") " pod="openstack/placement-db-create-wrt7x" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.226357 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j94jj\" (UniqueName: \"kubernetes.io/projected/06792de7-f343-4a3f-ad0b-66577ab2aca6-kube-api-access-j94jj\") pod \"placement-db-create-wrt7x\" (UID: \"06792de7-f343-4a3f-ad0b-66577ab2aca6\") " pod="openstack/placement-db-create-wrt7x" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.300516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdwcc\" (UniqueName: \"kubernetes.io/projected/d5e420f1-ef1a-4069-b328-caa602467dff-kube-api-access-tdwcc\") pod \"placement-751f-account-create-update-bngwt\" (UID: \"d5e420f1-ef1a-4069-b328-caa602467dff\") " pod="openstack/placement-751f-account-create-update-bngwt" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.300579 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e420f1-ef1a-4069-b328-caa602467dff-operator-scripts\") pod \"placement-751f-account-create-update-bngwt\" (UID: \"d5e420f1-ef1a-4069-b328-caa602467dff\") " pod="openstack/placement-751f-account-create-update-bngwt" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.301210 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e420f1-ef1a-4069-b328-caa602467dff-operator-scripts\") pod \"placement-751f-account-create-update-bngwt\" (UID: \"d5e420f1-ef1a-4069-b328-caa602467dff\") " pod="openstack/placement-751f-account-create-update-bngwt" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.326548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tdwcc\" (UniqueName: \"kubernetes.io/projected/d5e420f1-ef1a-4069-b328-caa602467dff-kube-api-access-tdwcc\") pod \"placement-751f-account-create-update-bngwt\" (UID: \"d5e420f1-ef1a-4069-b328-caa602467dff\") " pod="openstack/placement-751f-account-create-update-bngwt" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.336415 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wrt7x" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.380871 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-751f-account-create-update-bngwt" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.851540 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:20 crc kubenswrapper[4795]: I0310 15:25:20.915197 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.014159 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-dns-svc\") pod \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.014302 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-config\") pod \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.014345 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt4wk\" (UniqueName: \"kubernetes.io/projected/e3270be3-9263-4d42-bcfc-f7cfbed584e1-kube-api-access-wt4wk\") pod 
\"e3270be3-9263-4d42-bcfc-f7cfbed584e1\" (UID: \"e3270be3-9263-4d42-bcfc-f7cfbed584e1\") " Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.020624 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3270be3-9263-4d42-bcfc-f7cfbed584e1-kube-api-access-wt4wk" (OuterVolumeSpecName: "kube-api-access-wt4wk") pod "e3270be3-9263-4d42-bcfc-f7cfbed584e1" (UID: "e3270be3-9263-4d42-bcfc-f7cfbed584e1"). InnerVolumeSpecName "kube-api-access-wt4wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.035124 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-config" (OuterVolumeSpecName: "config") pod "e3270be3-9263-4d42-bcfc-f7cfbed584e1" (UID: "e3270be3-9263-4d42-bcfc-f7cfbed584e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.035976 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e3270be3-9263-4d42-bcfc-f7cfbed584e1" (UID: "e3270be3-9263-4d42-bcfc-f7cfbed584e1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.115621 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.115654 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt4wk\" (UniqueName: \"kubernetes.io/projected/e3270be3-9263-4d42-bcfc-f7cfbed584e1-kube-api-access-wt4wk\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.115664 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e3270be3-9263-4d42-bcfc-f7cfbed584e1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.540171 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" event={"ID":"e3270be3-9263-4d42-bcfc-f7cfbed584e1","Type":"ContainerDied","Data":"d8d9c4cfa2240e13485ffe4255df1e09c9d8bb4bcc05601bfcea344c6620e6b2"} Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.540205 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-84fmc" Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.540234 4795 scope.go:117] "RemoveContainer" containerID="2d04dfd9e98074883701c9ac066a4b3b213c01b176db26cfdc1407e456cb60e8" Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.584588 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-84fmc"] Mar 10 15:25:21 crc kubenswrapper[4795]: I0310 15:25:21.591033 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-84fmc"] Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.344429 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-68gqp"] Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.558749 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" event={"ID":"e2b540d3-b27e-4bf6-922c-89fef3c66955","Type":"ContainerStarted","Data":"c53b8d09e84e7e4cdb96ebd3e0ade02887c0bce0356a6a5e016382d13f4fdfe5"} Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.559723 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.566060 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-68gqp" event={"ID":"4f3d684b-c185-4a2b-a739-a40d239bc661","Type":"ContainerStarted","Data":"b7de940adacd66cc1f65d87748aca94c04d53704d993b9ff4465b4e3b00aacb9"} Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.566126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-68gqp" event={"ID":"4f3d684b-c185-4a2b-a739-a40d239bc661","Type":"ContainerStarted","Data":"2d4b808338e797cf83d4e27f9cd8a0310c5952c4ff9703d03a28f32aacdcb31f"} Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.573227 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"f8be8f3c-b4c1-41dc-99bf-3950c21ce504","Type":"ContainerStarted","Data":"f89f31df132d61f2c116ac4ef93e635b543832c99999485808131fb73f005c48"} Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.584398 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" podStartSLOduration=5.903750612 podStartE2EDuration="10.584376343s" podCreationTimestamp="2026-03-10 15:25:12 +0000 UTC" firstStartedPulling="2026-03-10 15:25:13.955183227 +0000 UTC m=+1147.120924125" lastFinishedPulling="2026-03-10 15:25:18.635808958 +0000 UTC m=+1151.801549856" observedRunningTime="2026-03-10 15:25:22.582385196 +0000 UTC m=+1155.748126104" watchObservedRunningTime="2026-03-10 15:25:22.584376343 +0000 UTC m=+1155.750117251" Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.597049 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" event={"ID":"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1","Type":"ContainerStarted","Data":"1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c"} Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.597842 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.615841 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-68gqp" podStartSLOduration=4.615819951 podStartE2EDuration="4.615819951s" podCreationTimestamp="2026-03-10 15:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:25:22.598383593 +0000 UTC m=+1155.764124491" watchObservedRunningTime="2026-03-10 15:25:22.615819951 +0000 UTC m=+1155.781560849" Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.621905 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-57d65f699f-kltrc" podStartSLOduration=5.683812909 podStartE2EDuration="10.621886925s" podCreationTimestamp="2026-03-10 15:25:12 +0000 UTC" firstStartedPulling="2026-03-10 15:25:13.694222352 +0000 UTC m=+1146.859963250" lastFinishedPulling="2026-03-10 15:25:18.632296368 +0000 UTC m=+1151.798037266" observedRunningTime="2026-03-10 15:25:22.616529292 +0000 UTC m=+1155.782270190" watchObservedRunningTime="2026-03-10 15:25:22.621886925 +0000 UTC m=+1155.787627823" Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.622733 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nwz6d" event={"ID":"c3379cef-da13-42d0-80b5-1600bbde9f95","Type":"ContainerStarted","Data":"8237646eae9e38784aabb04c2c0883bc0ce94bd84e2ca7bbf50caba4b7e45053"} Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.653563 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-nwz6d" podStartSLOduration=2.205126779 podStartE2EDuration="10.653548869s" podCreationTimestamp="2026-03-10 15:25:12 +0000 UTC" firstStartedPulling="2026-03-10 15:25:13.601203534 +0000 UTC m=+1146.766944432" lastFinishedPulling="2026-03-10 15:25:22.049625624 +0000 UTC m=+1155.215366522" observedRunningTime="2026-03-10 15:25:22.644879462 +0000 UTC m=+1155.810620360" watchObservedRunningTime="2026-03-10 15:25:22.653548869 +0000 UTC m=+1155.819289767" Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.682160 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6b7b-account-create-update-2mtzv"] Mar 10 15:25:22 crc kubenswrapper[4795]: W0310 15:25:22.686654 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda52410d0_e4ef_4d3c_849f_be86e8e4333d.slice/crio-d54cf694c50188c6d1a13370cee2aaf4736f94aae3ccff06821d11bf543d47d8 WatchSource:0}: Error finding container 
d54cf694c50188c6d1a13370cee2aaf4736f94aae3ccff06821d11bf543d47d8: Status 404 returned error can't find the container with id d54cf694c50188c6d1a13370cee2aaf4736f94aae3ccff06821d11bf543d47d8 Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.695890 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-280d-account-create-update-l8cfw"] Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.702141 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-wrt7x"] Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.843304 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4nvx5"] Mar 10 15:25:22 crc kubenswrapper[4795]: I0310 15:25:22.849128 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-751f-account-create-update-bngwt"] Mar 10 15:25:22 crc kubenswrapper[4795]: W0310 15:25:22.856353 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5e420f1_ef1a_4069_b328_caa602467dff.slice/crio-90b93b0f7fef548476dc50e6c98bba629541f9fdbe0981bf8c3517fd30680604 WatchSource:0}: Error finding container 90b93b0f7fef548476dc50e6c98bba629541f9fdbe0981bf8c3517fd30680604: Status 404 returned error can't find the container with id 90b93b0f7fef548476dc50e6c98bba629541f9fdbe0981bf8c3517fd30680604 Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.488172 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3270be3-9263-4d42-bcfc-f7cfbed584e1" path="/var/lib/kubelet/pods/e3270be3-9263-4d42-bcfc-f7cfbed584e1/volumes" Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.635053 4795 generic.go:334] "Generic (PLEG): container finished" podID="4f3d684b-c185-4a2b-a739-a40d239bc661" containerID="b7de940adacd66cc1f65d87748aca94c04d53704d993b9ff4465b4e3b00aacb9" exitCode=0 Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.635218 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-68gqp" event={"ID":"4f3d684b-c185-4a2b-a739-a40d239bc661","Type":"ContainerDied","Data":"b7de940adacd66cc1f65d87748aca94c04d53704d993b9ff4465b4e3b00aacb9"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.638407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f8be8f3c-b4c1-41dc-99bf-3950c21ce504","Type":"ContainerStarted","Data":"f0d804cb5e4feb2347191962151614710e16566ab99c0a40ab2238dda05e7a22"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.638616 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.641103 4795 generic.go:334] "Generic (PLEG): container finished" podID="84099369-9f74-4a3a-baca-2a83bb833639" containerID="176afdd3bc3ab97589f8c5d1929bb8f28e42e3d0a14c757e17e490e339c9bee7" exitCode=0 Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.641202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4nvx5" event={"ID":"84099369-9f74-4a3a-baca-2a83bb833639","Type":"ContainerDied","Data":"176afdd3bc3ab97589f8c5d1929bb8f28e42e3d0a14c757e17e490e339c9bee7"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.641288 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4nvx5" event={"ID":"84099369-9f74-4a3a-baca-2a83bb833639","Type":"ContainerStarted","Data":"0612547a878cfe2e913d7bc74d5f42034fefb03039509f9c575bf00fceb46e64"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.645451 4795 generic.go:334] "Generic (PLEG): container finished" podID="06792de7-f343-4a3f-ad0b-66577ab2aca6" containerID="3be0320b536cdf2bc0798c16517491d4a92ebac4beca8079c069ba5b805f8075" exitCode=0 Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.645572 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wrt7x" 
event={"ID":"06792de7-f343-4a3f-ad0b-66577ab2aca6","Type":"ContainerDied","Data":"3be0320b536cdf2bc0798c16517491d4a92ebac4beca8079c069ba5b805f8075"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.645942 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wrt7x" event={"ID":"06792de7-f343-4a3f-ad0b-66577ab2aca6","Type":"ContainerStarted","Data":"6a3c45ffc78735187b7d616ea8abc302aa7517cdfcf0fb3fc42a4ece001f0bb6"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.648050 4795 generic.go:334] "Generic (PLEG): container finished" podID="d5e420f1-ef1a-4069-b328-caa602467dff" containerID="88d260b87ff73e121e6f1c9a7909298430d8a87a7f2544d4f0366a925f2f6acd" exitCode=0 Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.648154 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-751f-account-create-update-bngwt" event={"ID":"d5e420f1-ef1a-4069-b328-caa602467dff","Type":"ContainerDied","Data":"88d260b87ff73e121e6f1c9a7909298430d8a87a7f2544d4f0366a925f2f6acd"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.648194 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-751f-account-create-update-bngwt" event={"ID":"d5e420f1-ef1a-4069-b328-caa602467dff","Type":"ContainerStarted","Data":"90b93b0f7fef548476dc50e6c98bba629541f9fdbe0981bf8c3517fd30680604"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.651250 4795 generic.go:334] "Generic (PLEG): container finished" podID="dcc47465-6390-4daf-981d-477efb47c502" containerID="09744bb1fd442df92c8a1cb70ae01868e9b50693a0f91d0ef495f07c95af86b8" exitCode=0 Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.651434 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6b7b-account-create-update-2mtzv" event={"ID":"dcc47465-6390-4daf-981d-477efb47c502","Type":"ContainerDied","Data":"09744bb1fd442df92c8a1cb70ae01868e9b50693a0f91d0ef495f07c95af86b8"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 
15:25:23.651474 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6b7b-account-create-update-2mtzv" event={"ID":"dcc47465-6390-4daf-981d-477efb47c502","Type":"ContainerStarted","Data":"5beca07eed9b964554a9715773b4c8bf882df65eb17a13288de4335cc9c1aabd"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.660018 4795 generic.go:334] "Generic (PLEG): container finished" podID="a52410d0-e4ef-4d3c-849f-be86e8e4333d" containerID="60dda952fab40ff5b66b3707749e6cf3fc95a18b7aaa9beefb8ff20d28d595ed" exitCode=0 Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.661244 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-280d-account-create-update-l8cfw" event={"ID":"a52410d0-e4ef-4d3c-849f-be86e8e4333d","Type":"ContainerDied","Data":"60dda952fab40ff5b66b3707749e6cf3fc95a18b7aaa9beefb8ff20d28d595ed"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.661331 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-280d-account-create-update-l8cfw" event={"ID":"a52410d0-e4ef-4d3c-849f-be86e8e4333d","Type":"ContainerStarted","Data":"d54cf694c50188c6d1a13370cee2aaf4736f94aae3ccff06821d11bf543d47d8"} Mar 10 15:25:23 crc kubenswrapper[4795]: I0310 15:25:23.719576 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.296362938 podStartE2EDuration="10.719395412s" podCreationTimestamp="2026-03-10 15:25:13 +0000 UTC" firstStartedPulling="2026-03-10 15:25:14.530656439 +0000 UTC m=+1147.696397327" lastFinishedPulling="2026-03-10 15:25:21.953688903 +0000 UTC m=+1155.119429801" observedRunningTime="2026-03-10 15:25:23.700053489 +0000 UTC m=+1156.865794427" watchObservedRunningTime="2026-03-10 15:25:23.719395412 +0000 UTC m=+1156.885136320" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.104384 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-280d-account-create-update-l8cfw" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.207498 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrz9j\" (UniqueName: \"kubernetes.io/projected/a52410d0-e4ef-4d3c-849f-be86e8e4333d-kube-api-access-xrz9j\") pod \"a52410d0-e4ef-4d3c-849f-be86e8e4333d\" (UID: \"a52410d0-e4ef-4d3c-849f-be86e8e4333d\") " Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.207673 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a52410d0-e4ef-4d3c-849f-be86e8e4333d-operator-scripts\") pod \"a52410d0-e4ef-4d3c-849f-be86e8e4333d\" (UID: \"a52410d0-e4ef-4d3c-849f-be86e8e4333d\") " Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.208825 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a52410d0-e4ef-4d3c-849f-be86e8e4333d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a52410d0-e4ef-4d3c-849f-be86e8e4333d" (UID: "a52410d0-e4ef-4d3c-849f-be86e8e4333d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.213745 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a52410d0-e4ef-4d3c-849f-be86e8e4333d-kube-api-access-xrz9j" (OuterVolumeSpecName: "kube-api-access-xrz9j") pod "a52410d0-e4ef-4d3c-849f-be86e8e4333d" (UID: "a52410d0-e4ef-4d3c-849f-be86e8e4333d"). InnerVolumeSpecName "kube-api-access-xrz9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.264607 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-751f-account-create-update-bngwt" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.268840 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wrt7x" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.273358 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4nvx5" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.284339 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-68gqp" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.293743 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6b7b-account-create-update-2mtzv" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.309773 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a52410d0-e4ef-4d3c-849f-be86e8e4333d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.309801 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrz9j\" (UniqueName: \"kubernetes.io/projected/a52410d0-e4ef-4d3c-849f-be86e8e4333d-kube-api-access-xrz9j\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.410319 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbjkr\" (UniqueName: \"kubernetes.io/projected/dcc47465-6390-4daf-981d-477efb47c502-kube-api-access-tbjkr\") pod \"dcc47465-6390-4daf-981d-477efb47c502\" (UID: \"dcc47465-6390-4daf-981d-477efb47c502\") " Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.410405 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/dcc47465-6390-4daf-981d-477efb47c502-operator-scripts\") pod \"dcc47465-6390-4daf-981d-477efb47c502\" (UID: \"dcc47465-6390-4daf-981d-477efb47c502\") " Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.410476 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t9z2\" (UniqueName: \"kubernetes.io/projected/4f3d684b-c185-4a2b-a739-a40d239bc661-kube-api-access-9t9z2\") pod \"4f3d684b-c185-4a2b-a739-a40d239bc661\" (UID: \"4f3d684b-c185-4a2b-a739-a40d239bc661\") " Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.410503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j94jj\" (UniqueName: \"kubernetes.io/projected/06792de7-f343-4a3f-ad0b-66577ab2aca6-kube-api-access-j94jj\") pod \"06792de7-f343-4a3f-ad0b-66577ab2aca6\" (UID: \"06792de7-f343-4a3f-ad0b-66577ab2aca6\") " Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.410523 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84099369-9f74-4a3a-baca-2a83bb833639-operator-scripts\") pod \"84099369-9f74-4a3a-baca-2a83bb833639\" (UID: \"84099369-9f74-4a3a-baca-2a83bb833639\") " Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.410555 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e420f1-ef1a-4069-b328-caa602467dff-operator-scripts\") pod \"d5e420f1-ef1a-4069-b328-caa602467dff\" (UID: \"d5e420f1-ef1a-4069-b328-caa602467dff\") " Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.410576 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06792de7-f343-4a3f-ad0b-66577ab2aca6-operator-scripts\") pod \"06792de7-f343-4a3f-ad0b-66577ab2aca6\" (UID: \"06792de7-f343-4a3f-ad0b-66577ab2aca6\") " Mar 10 
15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.410596 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdwcc\" (UniqueName: \"kubernetes.io/projected/d5e420f1-ef1a-4069-b328-caa602467dff-kube-api-access-tdwcc\") pod \"d5e420f1-ef1a-4069-b328-caa602467dff\" (UID: \"d5e420f1-ef1a-4069-b328-caa602467dff\") " Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.410623 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-424zn\" (UniqueName: \"kubernetes.io/projected/84099369-9f74-4a3a-baca-2a83bb833639-kube-api-access-424zn\") pod \"84099369-9f74-4a3a-baca-2a83bb833639\" (UID: \"84099369-9f74-4a3a-baca-2a83bb833639\") " Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.410673 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f3d684b-c185-4a2b-a739-a40d239bc661-operator-scripts\") pod \"4f3d684b-c185-4a2b-a739-a40d239bc661\" (UID: \"4f3d684b-c185-4a2b-a739-a40d239bc661\") " Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.410845 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcc47465-6390-4daf-981d-477efb47c502-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcc47465-6390-4daf-981d-477efb47c502" (UID: "dcc47465-6390-4daf-981d-477efb47c502"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.411046 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5e420f1-ef1a-4069-b328-caa602467dff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5e420f1-ef1a-4069-b328-caa602467dff" (UID: "d5e420f1-ef1a-4069-b328-caa602467dff"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.411241 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc47465-6390-4daf-981d-477efb47c502-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.411263 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5e420f1-ef1a-4069-b328-caa602467dff-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.411396 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f3d684b-c185-4a2b-a739-a40d239bc661-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f3d684b-c185-4a2b-a739-a40d239bc661" (UID: "4f3d684b-c185-4a2b-a739-a40d239bc661"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.411437 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84099369-9f74-4a3a-baca-2a83bb833639-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "84099369-9f74-4a3a-baca-2a83bb833639" (UID: "84099369-9f74-4a3a-baca-2a83bb833639"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.411755 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06792de7-f343-4a3f-ad0b-66577ab2aca6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06792de7-f343-4a3f-ad0b-66577ab2aca6" (UID: "06792de7-f343-4a3f-ad0b-66577ab2aca6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.413076 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc47465-6390-4daf-981d-477efb47c502-kube-api-access-tbjkr" (OuterVolumeSpecName: "kube-api-access-tbjkr") pod "dcc47465-6390-4daf-981d-477efb47c502" (UID: "dcc47465-6390-4daf-981d-477efb47c502"). InnerVolumeSpecName "kube-api-access-tbjkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.413489 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84099369-9f74-4a3a-baca-2a83bb833639-kube-api-access-424zn" (OuterVolumeSpecName: "kube-api-access-424zn") pod "84099369-9f74-4a3a-baca-2a83bb833639" (UID: "84099369-9f74-4a3a-baca-2a83bb833639"). InnerVolumeSpecName "kube-api-access-424zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.413882 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3d684b-c185-4a2b-a739-a40d239bc661-kube-api-access-9t9z2" (OuterVolumeSpecName: "kube-api-access-9t9z2") pod "4f3d684b-c185-4a2b-a739-a40d239bc661" (UID: "4f3d684b-c185-4a2b-a739-a40d239bc661"). InnerVolumeSpecName "kube-api-access-9t9z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.414458 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06792de7-f343-4a3f-ad0b-66577ab2aca6-kube-api-access-j94jj" (OuterVolumeSpecName: "kube-api-access-j94jj") pod "06792de7-f343-4a3f-ad0b-66577ab2aca6" (UID: "06792de7-f343-4a3f-ad0b-66577ab2aca6"). InnerVolumeSpecName "kube-api-access-j94jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.416219 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e420f1-ef1a-4069-b328-caa602467dff-kube-api-access-tdwcc" (OuterVolumeSpecName: "kube-api-access-tdwcc") pod "d5e420f1-ef1a-4069-b328-caa602467dff" (UID: "d5e420f1-ef1a-4069-b328-caa602467dff"). InnerVolumeSpecName "kube-api-access-tdwcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.512441 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t9z2\" (UniqueName: \"kubernetes.io/projected/4f3d684b-c185-4a2b-a739-a40d239bc661-kube-api-access-9t9z2\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.512476 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j94jj\" (UniqueName: \"kubernetes.io/projected/06792de7-f343-4a3f-ad0b-66577ab2aca6-kube-api-access-j94jj\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.512490 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/84099369-9f74-4a3a-baca-2a83bb833639-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.512502 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06792de7-f343-4a3f-ad0b-66577ab2aca6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.512514 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdwcc\" (UniqueName: \"kubernetes.io/projected/d5e420f1-ef1a-4069-b328-caa602467dff-kube-api-access-tdwcc\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.512526 4795 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-424zn\" (UniqueName: \"kubernetes.io/projected/84099369-9f74-4a3a-baca-2a83bb833639-kube-api-access-424zn\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.512538 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f3d684b-c185-4a2b-a739-a40d239bc661-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.512549 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbjkr\" (UniqueName: \"kubernetes.io/projected/dcc47465-6390-4daf-981d-477efb47c502-kube-api-access-tbjkr\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.685462 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6b7b-account-create-update-2mtzv" event={"ID":"dcc47465-6390-4daf-981d-477efb47c502","Type":"ContainerDied","Data":"5beca07eed9b964554a9715773b4c8bf882df65eb17a13288de4335cc9c1aabd"} Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.685523 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5beca07eed9b964554a9715773b4c8bf882df65eb17a13288de4335cc9c1aabd" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.685524 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6b7b-account-create-update-2mtzv" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.689181 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-280d-account-create-update-l8cfw" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.689228 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-280d-account-create-update-l8cfw" event={"ID":"a52410d0-e4ef-4d3c-849f-be86e8e4333d","Type":"ContainerDied","Data":"d54cf694c50188c6d1a13370cee2aaf4736f94aae3ccff06821d11bf543d47d8"} Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.689351 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d54cf694c50188c6d1a13370cee2aaf4736f94aae3ccff06821d11bf543d47d8" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.692152 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-68gqp" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.692172 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-68gqp" event={"ID":"4f3d684b-c185-4a2b-a739-a40d239bc661","Type":"ContainerDied","Data":"2d4b808338e797cf83d4e27f9cd8a0310c5952c4ff9703d03a28f32aacdcb31f"} Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.692254 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d4b808338e797cf83d4e27f9cd8a0310c5952c4ff9703d03a28f32aacdcb31f" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.694301 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4nvx5" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.694326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4nvx5" event={"ID":"84099369-9f74-4a3a-baca-2a83bb833639","Type":"ContainerDied","Data":"0612547a878cfe2e913d7bc74d5f42034fefb03039509f9c575bf00fceb46e64"} Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.694362 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0612547a878cfe2e913d7bc74d5f42034fefb03039509f9c575bf00fceb46e64" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.696131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-wrt7x" event={"ID":"06792de7-f343-4a3f-ad0b-66577ab2aca6","Type":"ContainerDied","Data":"6a3c45ffc78735187b7d616ea8abc302aa7517cdfcf0fb3fc42a4ece001f0bb6"} Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.696161 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-wrt7x" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.696166 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a3c45ffc78735187b7d616ea8abc302aa7517cdfcf0fb3fc42a4ece001f0bb6" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.697804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-751f-account-create-update-bngwt" event={"ID":"d5e420f1-ef1a-4069-b328-caa602467dff","Type":"ContainerDied","Data":"90b93b0f7fef548476dc50e6c98bba629541f9fdbe0981bf8c3517fd30680604"} Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.697833 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-751f-account-create-update-bngwt" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.697840 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90b93b0f7fef548476dc50e6c98bba629541f9fdbe0981bf8c3517fd30680604" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.840423 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-j4nd7"] Mar 10 15:25:25 crc kubenswrapper[4795]: E0310 15:25:25.840938 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcc47465-6390-4daf-981d-477efb47c502" containerName="mariadb-account-create-update" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.841037 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc47465-6390-4daf-981d-477efb47c502" containerName="mariadb-account-create-update" Mar 10 15:25:25 crc kubenswrapper[4795]: E0310 15:25:25.841137 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a52410d0-e4ef-4d3c-849f-be86e8e4333d" containerName="mariadb-account-create-update" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.841196 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a52410d0-e4ef-4d3c-849f-be86e8e4333d" containerName="mariadb-account-create-update" Mar 10 15:25:25 crc kubenswrapper[4795]: E0310 15:25:25.841266 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3270be3-9263-4d42-bcfc-f7cfbed584e1" containerName="init" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.841330 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3270be3-9263-4d42-bcfc-f7cfbed584e1" containerName="init" Mar 10 15:25:25 crc kubenswrapper[4795]: E0310 15:25:25.841389 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e420f1-ef1a-4069-b328-caa602467dff" containerName="mariadb-account-create-update" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.841447 4795 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="d5e420f1-ef1a-4069-b328-caa602467dff" containerName="mariadb-account-create-update" Mar 10 15:25:25 crc kubenswrapper[4795]: E0310 15:25:25.841505 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06792de7-f343-4a3f-ad0b-66577ab2aca6" containerName="mariadb-database-create" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.841559 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="06792de7-f343-4a3f-ad0b-66577ab2aca6" containerName="mariadb-database-create" Mar 10 15:25:25 crc kubenswrapper[4795]: E0310 15:25:25.841616 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84099369-9f74-4a3a-baca-2a83bb833639" containerName="mariadb-database-create" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.841675 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="84099369-9f74-4a3a-baca-2a83bb833639" containerName="mariadb-database-create" Mar 10 15:25:25 crc kubenswrapper[4795]: E0310 15:25:25.841729 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3d684b-c185-4a2b-a739-a40d239bc661" containerName="mariadb-database-create" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.841783 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3d684b-c185-4a2b-a739-a40d239bc661" containerName="mariadb-database-create" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.842192 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a52410d0-e4ef-4d3c-849f-be86e8e4333d" containerName="mariadb-account-create-update" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.842284 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3d684b-c185-4a2b-a739-a40d239bc661" containerName="mariadb-database-create" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.842344 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3270be3-9263-4d42-bcfc-f7cfbed584e1" containerName="init" Mar 10 15:25:25 crc 
kubenswrapper[4795]: I0310 15:25:25.842396 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcc47465-6390-4daf-981d-477efb47c502" containerName="mariadb-account-create-update" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.842461 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e420f1-ef1a-4069-b328-caa602467dff" containerName="mariadb-account-create-update" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.842516 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="06792de7-f343-4a3f-ad0b-66577ab2aca6" containerName="mariadb-database-create" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.842680 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="84099369-9f74-4a3a-baca-2a83bb833639" containerName="mariadb-database-create" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.843227 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j4nd7" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.852455 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.872244 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j4nd7"] Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.926801 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa9a9748-fc28-4830-9978-16fd1e36bac4-operator-scripts\") pod \"root-account-create-update-j4nd7\" (UID: \"fa9a9748-fc28-4830-9978-16fd1e36bac4\") " pod="openstack/root-account-create-update-j4nd7" Mar 10 15:25:25 crc kubenswrapper[4795]: I0310 15:25:25.926890 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz7b4\" (UniqueName: 
\"kubernetes.io/projected/fa9a9748-fc28-4830-9978-16fd1e36bac4-kube-api-access-nz7b4\") pod \"root-account-create-update-j4nd7\" (UID: \"fa9a9748-fc28-4830-9978-16fd1e36bac4\") " pod="openstack/root-account-create-update-j4nd7" Mar 10 15:25:26 crc kubenswrapper[4795]: I0310 15:25:26.028193 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz7b4\" (UniqueName: \"kubernetes.io/projected/fa9a9748-fc28-4830-9978-16fd1e36bac4-kube-api-access-nz7b4\") pod \"root-account-create-update-j4nd7\" (UID: \"fa9a9748-fc28-4830-9978-16fd1e36bac4\") " pod="openstack/root-account-create-update-j4nd7" Mar 10 15:25:26 crc kubenswrapper[4795]: I0310 15:25:26.028361 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa9a9748-fc28-4830-9978-16fd1e36bac4-operator-scripts\") pod \"root-account-create-update-j4nd7\" (UID: \"fa9a9748-fc28-4830-9978-16fd1e36bac4\") " pod="openstack/root-account-create-update-j4nd7" Mar 10 15:25:26 crc kubenswrapper[4795]: I0310 15:25:26.029345 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa9a9748-fc28-4830-9978-16fd1e36bac4-operator-scripts\") pod \"root-account-create-update-j4nd7\" (UID: \"fa9a9748-fc28-4830-9978-16fd1e36bac4\") " pod="openstack/root-account-create-update-j4nd7" Mar 10 15:25:26 crc kubenswrapper[4795]: I0310 15:25:26.045638 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz7b4\" (UniqueName: \"kubernetes.io/projected/fa9a9748-fc28-4830-9978-16fd1e36bac4-kube-api-access-nz7b4\") pod \"root-account-create-update-j4nd7\" (UID: \"fa9a9748-fc28-4830-9978-16fd1e36bac4\") " pod="openstack/root-account-create-update-j4nd7" Mar 10 15:25:26 crc kubenswrapper[4795]: I0310 15:25:26.165150 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j4nd7" Mar 10 15:25:26 crc kubenswrapper[4795]: W0310 15:25:26.609807 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa9a9748_fc28_4830_9978_16fd1e36bac4.slice/crio-582d744363eb87031a8356408f4009b8d72a0a165598508e1ae0b04f439f082b WatchSource:0}: Error finding container 582d744363eb87031a8356408f4009b8d72a0a165598508e1ae0b04f439f082b: Status 404 returned error can't find the container with id 582d744363eb87031a8356408f4009b8d72a0a165598508e1ae0b04f439f082b Mar 10 15:25:26 crc kubenswrapper[4795]: I0310 15:25:26.623716 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j4nd7"] Mar 10 15:25:26 crc kubenswrapper[4795]: I0310 15:25:26.710855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j4nd7" event={"ID":"fa9a9748-fc28-4830-9978-16fd1e36bac4","Type":"ContainerStarted","Data":"582d744363eb87031a8356408f4009b8d72a0a165598508e1ae0b04f439f082b"} Mar 10 15:25:27 crc kubenswrapper[4795]: I0310 15:25:27.720082 4795 generic.go:334] "Generic (PLEG): container finished" podID="fa9a9748-fc28-4830-9978-16fd1e36bac4" containerID="feedcd326acfb71c4075d5a088bbd09fcec66b1c401a6407d73cf777ac68913f" exitCode=0 Mar 10 15:25:27 crc kubenswrapper[4795]: I0310 15:25:27.720129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j4nd7" event={"ID":"fa9a9748-fc28-4830-9978-16fd1e36bac4","Type":"ContainerDied","Data":"feedcd326acfb71c4075d5a088bbd09fcec66b1c401a6407d73cf777ac68913f"} Mar 10 15:25:28 crc kubenswrapper[4795]: I0310 15:25:28.042264 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:28 crc kubenswrapper[4795]: I0310 15:25:28.061126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:28 crc kubenswrapper[4795]: E0310 15:25:28.061442 4795 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 10 15:25:28 crc kubenswrapper[4795]: E0310 15:25:28.061472 4795 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 10 15:25:28 crc kubenswrapper[4795]: E0310 15:25:28.061524 4795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift podName:612389f7-00cb-49cc-9daf-a5e451d6312f nodeName:}" failed. No retries permitted until 2026-03-10 15:25:44.061505361 +0000 UTC m=+1177.227246269 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift") pod "swift-storage-0" (UID: "612389f7-00cb-49cc-9daf-a5e451d6312f") : configmap "swift-ring-files" not found Mar 10 15:25:28 crc kubenswrapper[4795]: I0310 15:25:28.429309 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:25:28 crc kubenswrapper[4795]: I0310 15:25:28.492672 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-kltrc"] Mar 10 15:25:28 crc kubenswrapper[4795]: I0310 15:25:28.745922 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" podUID="63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" containerName="dnsmasq-dns" containerID="cri-o://1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c" gracePeriod=10 Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.119476 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j4nd7" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.190243 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-hr6mw"] Mar 10 15:25:29 crc kubenswrapper[4795]: E0310 15:25:29.190616 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9a9748-fc28-4830-9978-16fd1e36bac4" containerName="mariadb-account-create-update" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.190627 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9a9748-fc28-4830-9978-16fd1e36bac4" containerName="mariadb-account-create-update" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.190770 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9a9748-fc28-4830-9978-16fd1e36bac4" containerName="mariadb-account-create-update" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.191372 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.194285 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bxn5s" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.194307 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.199378 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hr6mw"] Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.243798 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.286778 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa9a9748-fc28-4830-9978-16fd1e36bac4-operator-scripts\") pod \"fa9a9748-fc28-4830-9978-16fd1e36bac4\" (UID: \"fa9a9748-fc28-4830-9978-16fd1e36bac4\") " Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.286885 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nz7b4\" (UniqueName: \"kubernetes.io/projected/fa9a9748-fc28-4830-9978-16fd1e36bac4-kube-api-access-nz7b4\") pod \"fa9a9748-fc28-4830-9978-16fd1e36bac4\" (UID: \"fa9a9748-fc28-4830-9978-16fd1e36bac4\") " Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.287149 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-config-data\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.287218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-combined-ca-bundle\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.287275 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hcvp\" (UniqueName: \"kubernetes.io/projected/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-kube-api-access-9hcvp\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 
crc kubenswrapper[4795]: I0310 15:25:29.287299 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-db-sync-config-data\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.287728 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9a9748-fc28-4830-9978-16fd1e36bac4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa9a9748-fc28-4830-9978-16fd1e36bac4" (UID: "fa9a9748-fc28-4830-9978-16fd1e36bac4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.292369 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9a9748-fc28-4830-9978-16fd1e36bac4-kube-api-access-nz7b4" (OuterVolumeSpecName: "kube-api-access-nz7b4") pod "fa9a9748-fc28-4830-9978-16fd1e36bac4" (UID: "fa9a9748-fc28-4830-9978-16fd1e36bac4"). InnerVolumeSpecName "kube-api-access-nz7b4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.388660 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr88w\" (UniqueName: \"kubernetes.io/projected/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-kube-api-access-dr88w\") pod \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.388747 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-config\") pod \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.388907 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-ovsdbserver-nb\") pod \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.388960 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-dns-svc\") pod \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\" (UID: \"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1\") " Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.389333 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-config-data\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.389448 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-combined-ca-bundle\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.389558 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hcvp\" (UniqueName: \"kubernetes.io/projected/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-kube-api-access-9hcvp\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.389599 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-db-sync-config-data\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.389711 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa9a9748-fc28-4830-9978-16fd1e36bac4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.389733 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nz7b4\" (UniqueName: \"kubernetes.io/projected/fa9a9748-fc28-4830-9978-16fd1e36bac4-kube-api-access-nz7b4\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.392375 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-kube-api-access-dr88w" (OuterVolumeSpecName: "kube-api-access-dr88w") pod "63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" (UID: "63ab1e80-e9e2-4dad-a77f-66fd09e56dc1"). InnerVolumeSpecName "kube-api-access-dr88w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.393885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-config-data\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.394166 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-db-sync-config-data\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.397825 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-combined-ca-bundle\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.405395 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hcvp\" (UniqueName: \"kubernetes.io/projected/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-kube-api-access-9hcvp\") pod \"glance-db-sync-hr6mw\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.426322 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-config" (OuterVolumeSpecName: "config") pod "63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" (UID: "63ab1e80-e9e2-4dad-a77f-66fd09e56dc1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.426602 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" (UID: "63ab1e80-e9e2-4dad-a77f-66fd09e56dc1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.433543 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" (UID: "63ab1e80-e9e2-4dad-a77f-66fd09e56dc1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.491294 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.491341 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.491362 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr88w\" (UniqueName: \"kubernetes.io/projected/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-kube-api-access-dr88w\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.491382 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:29 crc 
kubenswrapper[4795]: I0310 15:25:29.554592 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.753295 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j4nd7" event={"ID":"fa9a9748-fc28-4830-9978-16fd1e36bac4","Type":"ContainerDied","Data":"582d744363eb87031a8356408f4009b8d72a0a165598508e1ae0b04f439f082b"} Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.753514 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="582d744363eb87031a8356408f4009b8d72a0a165598508e1ae0b04f439f082b" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.753568 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j4nd7" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.756468 4795 generic.go:334] "Generic (PLEG): container finished" podID="63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" containerID="1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c" exitCode=0 Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.756505 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" event={"ID":"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1","Type":"ContainerDied","Data":"1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c"} Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.756521 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" event={"ID":"63ab1e80-e9e2-4dad-a77f-66fd09e56dc1","Type":"ContainerDied","Data":"1aa199b7788eb3b772954c527c5a82c19423eec3d16fd37e4dbe9d1a3cd95fa8"} Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.756536 4795 scope.go:117] "RemoveContainer" containerID="1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 
15:25:29.756621 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d65f699f-kltrc" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.758487 4795 generic.go:334] "Generic (PLEG): container finished" podID="c3379cef-da13-42d0-80b5-1600bbde9f95" containerID="8237646eae9e38784aabb04c2c0883bc0ce94bd84e2ca7bbf50caba4b7e45053" exitCode=0 Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.758518 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nwz6d" event={"ID":"c3379cef-da13-42d0-80b5-1600bbde9f95","Type":"ContainerDied","Data":"8237646eae9e38784aabb04c2c0883bc0ce94bd84e2ca7bbf50caba4b7e45053"} Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.785727 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-kltrc"] Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.786812 4795 scope.go:117] "RemoveContainer" containerID="916dfe839844d94f0d3a2c0647e31a4d42449e98b6dbf5555b86971f42e2b5b8" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.815493 4795 scope.go:117] "RemoveContainer" containerID="1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c" Mar 10 15:25:29 crc kubenswrapper[4795]: E0310 15:25:29.816527 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c\": container with ID starting with 1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c not found: ID does not exist" containerID="1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.816554 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c"} err="failed to get container status 
\"1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c\": rpc error: code = NotFound desc = could not find container \"1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c\": container with ID starting with 1afce5aa2efccfd982986b9006d1c4895e271ca3ad003ba204f7368a0e65970c not found: ID does not exist" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.816575 4795 scope.go:117] "RemoveContainer" containerID="916dfe839844d94f0d3a2c0647e31a4d42449e98b6dbf5555b86971f42e2b5b8" Mar 10 15:25:29 crc kubenswrapper[4795]: E0310 15:25:29.816921 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"916dfe839844d94f0d3a2c0647e31a4d42449e98b6dbf5555b86971f42e2b5b8\": container with ID starting with 916dfe839844d94f0d3a2c0647e31a4d42449e98b6dbf5555b86971f42e2b5b8 not found: ID does not exist" containerID="916dfe839844d94f0d3a2c0647e31a4d42449e98b6dbf5555b86971f42e2b5b8" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.816978 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916dfe839844d94f0d3a2c0647e31a4d42449e98b6dbf5555b86971f42e2b5b8"} err="failed to get container status \"916dfe839844d94f0d3a2c0647e31a4d42449e98b6dbf5555b86971f42e2b5b8\": rpc error: code = NotFound desc = could not find container \"916dfe839844d94f0d3a2c0647e31a4d42449e98b6dbf5555b86971f42e2b5b8\": container with ID starting with 916dfe839844d94f0d3a2c0647e31a4d42449e98b6dbf5555b86971f42e2b5b8 not found: ID does not exist" Mar 10 15:25:29 crc kubenswrapper[4795]: I0310 15:25:29.818744 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d65f699f-kltrc"] Mar 10 15:25:30 crc kubenswrapper[4795]: I0310 15:25:30.126930 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-hr6mw"] Mar 10 15:25:30 crc kubenswrapper[4795]: I0310 15:25:30.768257 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-db-sync-hr6mw" event={"ID":"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994","Type":"ContainerStarted","Data":"014d3c3b812f0891893d304c207d897b9a97214dcd77136754e2a623c34fdb44"} Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.097887 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.223995 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-dispersionconf\") pod \"c3379cef-da13-42d0-80b5-1600bbde9f95\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.224036 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-combined-ca-bundle\") pod \"c3379cef-da13-42d0-80b5-1600bbde9f95\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.224184 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khmj9\" (UniqueName: \"kubernetes.io/projected/c3379cef-da13-42d0-80b5-1600bbde9f95-kube-api-access-khmj9\") pod \"c3379cef-da13-42d0-80b5-1600bbde9f95\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.224209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-swiftconf\") pod \"c3379cef-da13-42d0-80b5-1600bbde9f95\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.224248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-ring-data-devices\") pod \"c3379cef-da13-42d0-80b5-1600bbde9f95\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.224280 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-scripts\") pod \"c3379cef-da13-42d0-80b5-1600bbde9f95\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.224383 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c3379cef-da13-42d0-80b5-1600bbde9f95-etc-swift\") pod \"c3379cef-da13-42d0-80b5-1600bbde9f95\" (UID: \"c3379cef-da13-42d0-80b5-1600bbde9f95\") " Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.225355 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c3379cef-da13-42d0-80b5-1600bbde9f95" (UID: "c3379cef-da13-42d0-80b5-1600bbde9f95"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.225432 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3379cef-da13-42d0-80b5-1600bbde9f95-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c3379cef-da13-42d0-80b5-1600bbde9f95" (UID: "c3379cef-da13-42d0-80b5-1600bbde9f95"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.231846 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3379cef-da13-42d0-80b5-1600bbde9f95-kube-api-access-khmj9" (OuterVolumeSpecName: "kube-api-access-khmj9") pod "c3379cef-da13-42d0-80b5-1600bbde9f95" (UID: "c3379cef-da13-42d0-80b5-1600bbde9f95"). InnerVolumeSpecName "kube-api-access-khmj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.232199 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c3379cef-da13-42d0-80b5-1600bbde9f95" (UID: "c3379cef-da13-42d0-80b5-1600bbde9f95"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.247327 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-scripts" (OuterVolumeSpecName: "scripts") pod "c3379cef-da13-42d0-80b5-1600bbde9f95" (UID: "c3379cef-da13-42d0-80b5-1600bbde9f95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.250729 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3379cef-da13-42d0-80b5-1600bbde9f95" (UID: "c3379cef-da13-42d0-80b5-1600bbde9f95"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.256890 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c3379cef-da13-42d0-80b5-1600bbde9f95" (UID: "c3379cef-da13-42d0-80b5-1600bbde9f95"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.326388 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.326715 4795 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c3379cef-da13-42d0-80b5-1600bbde9f95-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.326725 4795 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.326736 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.326746 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khmj9\" (UniqueName: \"kubernetes.io/projected/c3379cef-da13-42d0-80b5-1600bbde9f95-kube-api-access-khmj9\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.326755 4795 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/c3379cef-da13-42d0-80b5-1600bbde9f95-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.326764 4795 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c3379cef-da13-42d0-80b5-1600bbde9f95-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.487776 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" path="/var/lib/kubelet/pods/63ab1e80-e9e2-4dad-a77f-66fd09e56dc1/volumes" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.790747 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-nwz6d" event={"ID":"c3379cef-da13-42d0-80b5-1600bbde9f95","Type":"ContainerDied","Data":"2aa87d1e9c370aead7b73da943725d565b31cf2d9f004f5b6fee027821b82156"} Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.790789 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2aa87d1e9c370aead7b73da943725d565b31cf2d9f004f5b6fee027821b82156" Mar 10 15:25:31 crc kubenswrapper[4795]: I0310 15:25:31.790822 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-nwz6d" Mar 10 15:25:32 crc kubenswrapper[4795]: I0310 15:25:32.104932 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-j4nd7"] Mar 10 15:25:32 crc kubenswrapper[4795]: I0310 15:25:32.111283 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-j4nd7"] Mar 10 15:25:33 crc kubenswrapper[4795]: I0310 15:25:33.489183 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9a9748-fc28-4830-9978-16fd1e36bac4" path="/var/lib/kubelet/pods/fa9a9748-fc28-4830-9978-16fd1e36bac4/volumes" Mar 10 15:25:34 crc kubenswrapper[4795]: I0310 15:25:34.161567 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 10 15:25:36 crc kubenswrapper[4795]: I0310 15:25:36.844556 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ccf1a8-3778-482d-b6b5-303de43c6a7e" containerID="e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046" exitCode=0 Mar 10 15:25:36 crc kubenswrapper[4795]: I0310 15:25:36.844978 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c1ccf1a8-3778-482d-b6b5-303de43c6a7e","Type":"ContainerDied","Data":"e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046"} Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.134467 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jn6h5"] Mar 10 15:25:37 crc kubenswrapper[4795]: E0310 15:25:37.135177 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" containerName="dnsmasq-dns" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.135233 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" containerName="dnsmasq-dns" Mar 10 15:25:37 crc kubenswrapper[4795]: E0310 15:25:37.135289 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3379cef-da13-42d0-80b5-1600bbde9f95" containerName="swift-ring-rebalance" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.135309 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3379cef-da13-42d0-80b5-1600bbde9f95" containerName="swift-ring-rebalance" Mar 10 15:25:37 crc kubenswrapper[4795]: E0310 15:25:37.135347 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" containerName="init" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.135365 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" containerName="init" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.135788 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ab1e80-e9e2-4dad-a77f-66fd09e56dc1" containerName="dnsmasq-dns" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.135854 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3379cef-da13-42d0-80b5-1600bbde9f95" containerName="swift-ring-rebalance" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.136918 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jn6h5" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.143764 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jn6h5"] Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.144207 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.427154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80269433-7015-4f86-ac1e-81634478a4b4-operator-scripts\") pod \"root-account-create-update-jn6h5\" (UID: \"80269433-7015-4f86-ac1e-81634478a4b4\") " pod="openstack/root-account-create-update-jn6h5" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.427539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h74n6\" (UniqueName: \"kubernetes.io/projected/80269433-7015-4f86-ac1e-81634478a4b4-kube-api-access-h74n6\") pod \"root-account-create-update-jn6h5\" (UID: \"80269433-7015-4f86-ac1e-81634478a4b4\") " pod="openstack/root-account-create-update-jn6h5" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.528602 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h74n6\" (UniqueName: \"kubernetes.io/projected/80269433-7015-4f86-ac1e-81634478a4b4-kube-api-access-h74n6\") pod \"root-account-create-update-jn6h5\" (UID: \"80269433-7015-4f86-ac1e-81634478a4b4\") " pod="openstack/root-account-create-update-jn6h5" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.528868 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80269433-7015-4f86-ac1e-81634478a4b4-operator-scripts\") pod \"root-account-create-update-jn6h5\" (UID: 
\"80269433-7015-4f86-ac1e-81634478a4b4\") " pod="openstack/root-account-create-update-jn6h5" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.530453 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80269433-7015-4f86-ac1e-81634478a4b4-operator-scripts\") pod \"root-account-create-update-jn6h5\" (UID: \"80269433-7015-4f86-ac1e-81634478a4b4\") " pod="openstack/root-account-create-update-jn6h5" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.549768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h74n6\" (UniqueName: \"kubernetes.io/projected/80269433-7015-4f86-ac1e-81634478a4b4-kube-api-access-h74n6\") pod \"root-account-create-update-jn6h5\" (UID: \"80269433-7015-4f86-ac1e-81634478a4b4\") " pod="openstack/root-account-create-update-jn6h5" Mar 10 15:25:37 crc kubenswrapper[4795]: I0310 15:25:37.777437 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jn6h5" Mar 10 15:25:38 crc kubenswrapper[4795]: I0310 15:25:38.747823 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-x9p5v" podUID="4823094a-6e7f-49da-9aa5-7d67b893896c" containerName="ovn-controller" probeResult="failure" output=< Mar 10 15:25:38 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 15:25:38 crc kubenswrapper[4795]: > Mar 10 15:25:38 crc kubenswrapper[4795]: I0310 15:25:38.771314 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:25:41 crc kubenswrapper[4795]: I0310 15:25:41.593745 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jn6h5"] Mar 10 15:25:41 crc kubenswrapper[4795]: W0310 15:25:41.608938 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80269433_7015_4f86_ac1e_81634478a4b4.slice/crio-a2fa5e578cf959a909e3b82a5a4e46ec157f5652f8431df1fde80e6933c9121d WatchSource:0}: Error finding container a2fa5e578cf959a909e3b82a5a4e46ec157f5652f8431df1fde80e6933c9121d: Status 404 returned error can't find the container with id a2fa5e578cf959a909e3b82a5a4e46ec157f5652f8431df1fde80e6933c9121d Mar 10 15:25:41 crc kubenswrapper[4795]: I0310 15:25:41.888015 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c1ccf1a8-3778-482d-b6b5-303de43c6a7e","Type":"ContainerStarted","Data":"1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e"} Mar 10 15:25:41 crc kubenswrapper[4795]: I0310 15:25:41.888235 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 15:25:41 crc kubenswrapper[4795]: I0310 15:25:41.890152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-sync-hr6mw" event={"ID":"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994","Type":"ContainerStarted","Data":"57aa263732b323e9ae994e5fec3e55a8b13ef707bf2da626a94b6f6fd611c5cf"} Mar 10 15:25:41 crc kubenswrapper[4795]: I0310 15:25:41.894045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jn6h5" event={"ID":"80269433-7015-4f86-ac1e-81634478a4b4","Type":"ContainerStarted","Data":"1ee644d2b01d0d8f676164c61c90a5e0951443a0f999632dc3cd42f51fb7ad22"} Mar 10 15:25:41 crc kubenswrapper[4795]: I0310 15:25:41.894093 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jn6h5" event={"ID":"80269433-7015-4f86-ac1e-81634478a4b4","Type":"ContainerStarted","Data":"a2fa5e578cf959a909e3b82a5a4e46ec157f5652f8431df1fde80e6933c9121d"} Mar 10 15:25:41 crc kubenswrapper[4795]: I0310 15:25:41.924394 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.14660527 podStartE2EDuration="1m5.924371269s" podCreationTimestamp="2026-03-10 15:24:36 +0000 UTC" firstStartedPulling="2026-03-10 15:24:38.651386507 +0000 UTC m=+1111.817127405" lastFinishedPulling="2026-03-10 15:25:02.429152506 +0000 UTC m=+1135.594893404" observedRunningTime="2026-03-10 15:25:41.912346815 +0000 UTC m=+1175.078087713" watchObservedRunningTime="2026-03-10 15:25:41.924371269 +0000 UTC m=+1175.090112177" Mar 10 15:25:41 crc kubenswrapper[4795]: I0310 15:25:41.949978 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-hr6mw" podStartSLOduration=1.9105892899999999 podStartE2EDuration="12.949959411s" podCreationTimestamp="2026-03-10 15:25:29 +0000 UTC" firstStartedPulling="2026-03-10 15:25:30.132230284 +0000 UTC m=+1163.297971192" lastFinishedPulling="2026-03-10 15:25:41.171600395 +0000 UTC m=+1174.337341313" observedRunningTime="2026-03-10 15:25:41.935155477 +0000 UTC m=+1175.100896385" 
watchObservedRunningTime="2026-03-10 15:25:41.949959411 +0000 UTC m=+1175.115700309" Mar 10 15:25:41 crc kubenswrapper[4795]: I0310 15:25:41.951120 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-jn6h5" podStartSLOduration=4.951113674 podStartE2EDuration="4.951113674s" podCreationTimestamp="2026-03-10 15:25:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:25:41.950361162 +0000 UTC m=+1175.116102070" watchObservedRunningTime="2026-03-10 15:25:41.951113674 +0000 UTC m=+1175.116854572" Mar 10 15:25:42 crc kubenswrapper[4795]: I0310 15:25:42.906656 4795 generic.go:334] "Generic (PLEG): container finished" podID="80269433-7015-4f86-ac1e-81634478a4b4" containerID="1ee644d2b01d0d8f676164c61c90a5e0951443a0f999632dc3cd42f51fb7ad22" exitCode=0 Mar 10 15:25:42 crc kubenswrapper[4795]: I0310 15:25:42.907275 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jn6h5" event={"ID":"80269433-7015-4f86-ac1e-81634478a4b4","Type":"ContainerDied","Data":"1ee644d2b01d0d8f676164c61c90a5e0951443a0f999632dc3cd42f51fb7ad22"} Mar 10 15:25:43 crc kubenswrapper[4795]: I0310 15:25:43.756970 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-x9p5v" podUID="4823094a-6e7f-49da-9aa5-7d67b893896c" containerName="ovn-controller" probeResult="failure" output=< Mar 10 15:25:43 crc kubenswrapper[4795]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 10 15:25:43 crc kubenswrapper[4795]: > Mar 10 15:25:43 crc kubenswrapper[4795]: I0310 15:25:43.770813 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-sblqq" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.017166 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-x9p5v-config-9tdk7"] Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.018422 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.028489 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.081318 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x9p5v-config-9tdk7"] Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.136918 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-scripts\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.137293 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.137336 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m4bf\" (UniqueName: \"kubernetes.io/projected/1c005c8b-a72d-4d36-a055-cd9d6097b91f-kube-api-access-4m4bf\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.137383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-log-ovn\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.137414 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-additional-scripts\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.137460 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.137509 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run-ovn\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.145241 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/612389f7-00cb-49cc-9daf-a5e451d6312f-etc-swift\") pod \"swift-storage-0\" (UID: \"612389f7-00cb-49cc-9daf-a5e451d6312f\") " pod="openstack/swift-storage-0" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.238995 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-scripts\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.239113 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m4bf\" (UniqueName: \"kubernetes.io/projected/1c005c8b-a72d-4d36-a055-cd9d6097b91f-kube-api-access-4m4bf\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.239159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-log-ovn\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.239203 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-additional-scripts\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.239278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.239330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run-ovn\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.239621 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run-ovn\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.239698 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-log-ovn\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.241613 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-additional-scripts\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.241689 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.242276 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-scripts\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.256856 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m4bf\" (UniqueName: \"kubernetes.io/projected/1c005c8b-a72d-4d36-a055-cd9d6097b91f-kube-api-access-4m4bf\") pod \"ovn-controller-x9p5v-config-9tdk7\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.287969 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.339818 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jn6h5" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.347637 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.441720 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h74n6\" (UniqueName: \"kubernetes.io/projected/80269433-7015-4f86-ac1e-81634478a4b4-kube-api-access-h74n6\") pod \"80269433-7015-4f86-ac1e-81634478a4b4\" (UID: \"80269433-7015-4f86-ac1e-81634478a4b4\") " Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.442046 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80269433-7015-4f86-ac1e-81634478a4b4-operator-scripts\") pod \"80269433-7015-4f86-ac1e-81634478a4b4\" (UID: \"80269433-7015-4f86-ac1e-81634478a4b4\") " Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.442948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80269433-7015-4f86-ac1e-81634478a4b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80269433-7015-4f86-ac1e-81634478a4b4" (UID: "80269433-7015-4f86-ac1e-81634478a4b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.446715 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80269433-7015-4f86-ac1e-81634478a4b4-kube-api-access-h74n6" (OuterVolumeSpecName: "kube-api-access-h74n6") pod "80269433-7015-4f86-ac1e-81634478a4b4" (UID: "80269433-7015-4f86-ac1e-81634478a4b4"). InnerVolumeSpecName "kube-api-access-h74n6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.544114 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h74n6\" (UniqueName: \"kubernetes.io/projected/80269433-7015-4f86-ac1e-81634478a4b4-kube-api-access-h74n6\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.544150 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80269433-7015-4f86-ac1e-81634478a4b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.830563 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.903892 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x9p5v-config-9tdk7"] Mar 10 15:25:44 crc kubenswrapper[4795]: W0310 15:25:44.906722 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c005c8b_a72d_4d36_a055_cd9d6097b91f.slice/crio-0d9d950eb96388dea6935ca916c3000c0cf845b29ea756acca1bc4f7d7928100 WatchSource:0}: Error finding container 0d9d950eb96388dea6935ca916c3000c0cf845b29ea756acca1bc4f7d7928100: Status 404 returned error can't find the container with id 0d9d950eb96388dea6935ca916c3000c0cf845b29ea756acca1bc4f7d7928100 Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.926899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jn6h5" event={"ID":"80269433-7015-4f86-ac1e-81634478a4b4","Type":"ContainerDied","Data":"a2fa5e578cf959a909e3b82a5a4e46ec157f5652f8431df1fde80e6933c9121d"} Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.926970 4795 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a2fa5e578cf959a909e3b82a5a4e46ec157f5652f8431df1fde80e6933c9121d" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.927089 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jn6h5" Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.933628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x9p5v-config-9tdk7" event={"ID":"1c005c8b-a72d-4d36-a055-cd9d6097b91f","Type":"ContainerStarted","Data":"0d9d950eb96388dea6935ca916c3000c0cf845b29ea756acca1bc4f7d7928100"} Mar 10 15:25:44 crc kubenswrapper[4795]: I0310 15:25:44.937806 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"699fd7aea8ba7e9058477a8338233065ffa16cb6cff655aa8c92af53572f04b5"} Mar 10 15:25:45 crc kubenswrapper[4795]: I0310 15:25:45.948312 4795 generic.go:334] "Generic (PLEG): container finished" podID="1c005c8b-a72d-4d36-a055-cd9d6097b91f" containerID="53f0e6f7d8f377f6099f2b35e606b666a5c2ba24737f78a1a2dad32e126a9957" exitCode=0 Mar 10 15:25:45 crc kubenswrapper[4795]: I0310 15:25:45.948427 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x9p5v-config-9tdk7" event={"ID":"1c005c8b-a72d-4d36-a055-cd9d6097b91f","Type":"ContainerDied","Data":"53f0e6f7d8f377f6099f2b35e606b666a5c2ba24737f78a1a2dad32e126a9957"} Mar 10 15:25:46 crc kubenswrapper[4795]: I0310 15:25:46.967738 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"bb8433d7b645c4a34db39a29c79a3981c02504e6a8c02f7441a4ba3b3742c94e"} Mar 10 15:25:46 crc kubenswrapper[4795]: I0310 15:25:46.968281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"5d2e225d7bdd78f63be2e88b64639a8b54807625423c0a7e69df17f36d52b7f9"} Mar 10 15:25:46 crc kubenswrapper[4795]: I0310 15:25:46.968293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"ed5c58f61308c6f0286ec860cc7df23596f809a04792a029b952adcc93bac51e"} Mar 10 15:25:46 crc kubenswrapper[4795]: I0310 15:25:46.968303 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"2d981794455ebdc2ef8479bdb2b04acbc7363a6e7d0be63d0584c4a83a2c1273"} Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.243212 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.313774 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run-ovn\") pod \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.313863 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-scripts\") pod \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.313921 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-log-ovn\") pod \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " Mar 10 
15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.313975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m4bf\" (UniqueName: \"kubernetes.io/projected/1c005c8b-a72d-4d36-a055-cd9d6097b91f-kube-api-access-4m4bf\") pod \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.314016 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-additional-scripts\") pod \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.314086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run\") pod \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\" (UID: \"1c005c8b-a72d-4d36-a055-cd9d6097b91f\") " Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.314375 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run" (OuterVolumeSpecName: "var-run") pod "1c005c8b-a72d-4d36-a055-cd9d6097b91f" (UID: "1c005c8b-a72d-4d36-a055-cd9d6097b91f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.314392 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1c005c8b-a72d-4d36-a055-cd9d6097b91f" (UID: "1c005c8b-a72d-4d36-a055-cd9d6097b91f"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.315273 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1c005c8b-a72d-4d36-a055-cd9d6097b91f" (UID: "1c005c8b-a72d-4d36-a055-cd9d6097b91f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.315298 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-scripts" (OuterVolumeSpecName: "scripts") pod "1c005c8b-a72d-4d36-a055-cd9d6097b91f" (UID: "1c005c8b-a72d-4d36-a055-cd9d6097b91f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.316083 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1c005c8b-a72d-4d36-a055-cd9d6097b91f" (UID: "1c005c8b-a72d-4d36-a055-cd9d6097b91f"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.326410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c005c8b-a72d-4d36-a055-cd9d6097b91f-kube-api-access-4m4bf" (OuterVolumeSpecName: "kube-api-access-4m4bf") pod "1c005c8b-a72d-4d36-a055-cd9d6097b91f" (UID: "1c005c8b-a72d-4d36-a055-cd9d6097b91f"). InnerVolumeSpecName "kube-api-access-4m4bf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.415927 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.415957 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.415968 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m4bf\" (UniqueName: \"kubernetes.io/projected/1c005c8b-a72d-4d36-a055-cd9d6097b91f-kube-api-access-4m4bf\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.415978 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1c005c8b-a72d-4d36-a055-cd9d6097b91f-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.415986 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.415994 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c005c8b-a72d-4d36-a055-cd9d6097b91f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.979458 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x9p5v-config-9tdk7" event={"ID":"1c005c8b-a72d-4d36-a055-cd9d6097b91f","Type":"ContainerDied","Data":"0d9d950eb96388dea6935ca916c3000c0cf845b29ea756acca1bc4f7d7928100"} Mar 10 15:25:47 crc 
kubenswrapper[4795]: I0310 15:25:47.979498 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d9d950eb96388dea6935ca916c3000c0cf845b29ea756acca1bc4f7d7928100" Mar 10 15:25:47 crc kubenswrapper[4795]: I0310 15:25:47.979551 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x9p5v-config-9tdk7" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.345501 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-x9p5v-config-9tdk7"] Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.363304 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-x9p5v-config-9tdk7"] Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.457172 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-x9p5v-config-z59q5"] Mar 10 15:25:48 crc kubenswrapper[4795]: E0310 15:25:48.457905 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c005c8b-a72d-4d36-a055-cd9d6097b91f" containerName="ovn-config" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.457935 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c005c8b-a72d-4d36-a055-cd9d6097b91f" containerName="ovn-config" Mar 10 15:25:48 crc kubenswrapper[4795]: E0310 15:25:48.457970 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80269433-7015-4f86-ac1e-81634478a4b4" containerName="mariadb-account-create-update" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.457982 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="80269433-7015-4f86-ac1e-81634478a4b4" containerName="mariadb-account-create-update" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.458278 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="80269433-7015-4f86-ac1e-81634478a4b4" containerName="mariadb-account-create-update" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.458313 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1c005c8b-a72d-4d36-a055-cd9d6097b91f" containerName="ovn-config" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.459216 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.462760 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.467714 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x9p5v-config-z59q5"] Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.532640 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.532711 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run-ovn\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.532769 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d878t\" (UniqueName: \"kubernetes.io/projected/82d4d466-35c9-4fc9-ba1a-951193d2ccac-kube-api-access-d878t\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.532815 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-scripts\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.532838 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-additional-scripts\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.532946 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-log-ovn\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.540019 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.540095 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.635877 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run-ovn\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.636003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d878t\" (UniqueName: \"kubernetes.io/projected/82d4d466-35c9-4fc9-ba1a-951193d2ccac-kube-api-access-d878t\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.636254 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-scripts\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.636318 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-additional-scripts\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.636398 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-log-ovn\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.636576 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.636725 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.636754 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-log-ovn\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.636250 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run-ovn\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.637423 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-additional-scripts\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.638675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-scripts\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.656706 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d878t\" (UniqueName: \"kubernetes.io/projected/82d4d466-35c9-4fc9-ba1a-951193d2ccac-kube-api-access-d878t\") pod \"ovn-controller-x9p5v-config-z59q5\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.759651 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-x9p5v" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.856737 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.992461 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"95bbb91f09407e47de9c875692342cc6b96b3403089aad667583ba585174ea2c"} Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.992832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"a0fdab01f42da2ca06c259d86baf87dce3949973b939b0bd1012115f4a28e324"} Mar 10 15:25:48 crc kubenswrapper[4795]: I0310 15:25:48.992843 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"b195c12e9ef8750bbe4a478d86afc067e38464ae282fd46221ec000104a34d23"} Mar 10 15:25:49 crc kubenswrapper[4795]: I0310 15:25:49.331984 4795 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x9p5v-config-z59q5"] Mar 10 15:25:49 crc kubenswrapper[4795]: W0310 15:25:49.334995 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82d4d466_35c9_4fc9_ba1a_951193d2ccac.slice/crio-ba2c18c7009527ff03a075fe403da8a9c6a0b224e18ffe101dc371b1f637c344 WatchSource:0}: Error finding container ba2c18c7009527ff03a075fe403da8a9c6a0b224e18ffe101dc371b1f637c344: Status 404 returned error can't find the container with id ba2c18c7009527ff03a075fe403da8a9c6a0b224e18ffe101dc371b1f637c344 Mar 10 15:25:49 crc kubenswrapper[4795]: I0310 15:25:49.494000 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c005c8b-a72d-4d36-a055-cd9d6097b91f" path="/var/lib/kubelet/pods/1c005c8b-a72d-4d36-a055-cd9d6097b91f/volumes" Mar 10 15:25:50 crc kubenswrapper[4795]: I0310 15:25:50.009989 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x9p5v-config-z59q5" event={"ID":"82d4d466-35c9-4fc9-ba1a-951193d2ccac","Type":"ContainerStarted","Data":"ba2c18c7009527ff03a075fe403da8a9c6a0b224e18ffe101dc371b1f637c344"} Mar 10 15:25:51 crc kubenswrapper[4795]: I0310 15:25:51.025953 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" containerID="2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85" exitCode=0 Mar 10 15:25:51 crc kubenswrapper[4795]: I0310 15:25:51.026023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9b0778eb-949e-46e9-bc72-cc42ec440aa2","Type":"ContainerDied","Data":"2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85"} Mar 10 15:25:53 crc kubenswrapper[4795]: I0310 15:25:53.049264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"5beb98ea7e94b105e2a09c1bcec363fed5a612f392883bbd457b502d54236d36"} Mar 10 15:25:54 crc kubenswrapper[4795]: I0310 15:25:54.059425 4795 generic.go:334] "Generic (PLEG): container finished" podID="82d4d466-35c9-4fc9-ba1a-951193d2ccac" containerID="136554a0c840e18c932b5d01915abcb6c227714fdc2a18d7f1f6da7fbf6dc48a" exitCode=0 Mar 10 15:25:54 crc kubenswrapper[4795]: I0310 15:25:54.059483 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x9p5v-config-z59q5" event={"ID":"82d4d466-35c9-4fc9-ba1a-951193d2ccac","Type":"ContainerDied","Data":"136554a0c840e18c932b5d01915abcb6c227714fdc2a18d7f1f6da7fbf6dc48a"} Mar 10 15:25:54 crc kubenswrapper[4795]: I0310 15:25:54.063227 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9b0778eb-949e-46e9-bc72-cc42ec440aa2","Type":"ContainerStarted","Data":"0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e"} Mar 10 15:25:54 crc kubenswrapper[4795]: I0310 15:25:54.063437 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:25:54 crc kubenswrapper[4795]: I0310 15:25:54.096576 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371957.758217 podStartE2EDuration="1m19.096558224s" podCreationTimestamp="2026-03-10 15:24:35 +0000 UTC" firstStartedPulling="2026-03-10 15:24:38.489730602 +0000 UTC m=+1111.655471500" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:25:54.094722751 +0000 UTC m=+1187.260463669" watchObservedRunningTime="2026-03-10 15:25:54.096558224 +0000 UTC m=+1187.262299122" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.077752 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994" 
containerID="57aa263732b323e9ae994e5fec3e55a8b13ef707bf2da626a94b6f6fd611c5cf" exitCode=0 Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.077840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hr6mw" event={"ID":"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994","Type":"ContainerDied","Data":"57aa263732b323e9ae994e5fec3e55a8b13ef707bf2da626a94b6f6fd611c5cf"} Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.086904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"d216a98c35876505355a072c98803a407134fb32da4539f7508bb3a964861101"} Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.087004 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"fffb9fa514799eaad4c87be898e7acda679be28a7e3cbb20a6ed812a1df74818"} Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.368402 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.491300 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-log-ovn\") pod \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.491365 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d878t\" (UniqueName: \"kubernetes.io/projected/82d4d466-35c9-4fc9-ba1a-951193d2ccac-kube-api-access-d878t\") pod \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.491404 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-scripts\") pod \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.491432 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-additional-scripts\") pod \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.491451 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run-ovn\") pod \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.491510 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run\") pod \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\" (UID: \"82d4d466-35c9-4fc9-ba1a-951193d2ccac\") " Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.491935 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run" (OuterVolumeSpecName: "var-run") pod "82d4d466-35c9-4fc9-ba1a-951193d2ccac" (UID: "82d4d466-35c9-4fc9-ba1a-951193d2ccac"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.492667 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "82d4d466-35c9-4fc9-ba1a-951193d2ccac" (UID: "82d4d466-35c9-4fc9-ba1a-951193d2ccac"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.492709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "82d4d466-35c9-4fc9-ba1a-951193d2ccac" (UID: "82d4d466-35c9-4fc9-ba1a-951193d2ccac"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.492773 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "82d4d466-35c9-4fc9-ba1a-951193d2ccac" (UID: "82d4d466-35c9-4fc9-ba1a-951193d2ccac"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.493750 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-scripts" (OuterVolumeSpecName: "scripts") pod "82d4d466-35c9-4fc9-ba1a-951193d2ccac" (UID: "82d4d466-35c9-4fc9-ba1a-951193d2ccac"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.495298 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d4d466-35c9-4fc9-ba1a-951193d2ccac-kube-api-access-d878t" (OuterVolumeSpecName: "kube-api-access-d878t") pod "82d4d466-35c9-4fc9-ba1a-951193d2ccac" (UID: "82d4d466-35c9-4fc9-ba1a-951193d2ccac"). InnerVolumeSpecName "kube-api-access-d878t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.594389 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d878t\" (UniqueName: \"kubernetes.io/projected/82d4d466-35c9-4fc9-ba1a-951193d2ccac-kube-api-access-d878t\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.594752 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.594763 4795 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/82d4d466-35c9-4fc9-ba1a-951193d2ccac-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.594773 4795 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run-ovn\") on node \"crc\" 
DevicePath \"\"" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.594782 4795 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:55 crc kubenswrapper[4795]: I0310 15:25:55.594790 4795 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/82d4d466-35c9-4fc9-ba1a-951193d2ccac-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.099540 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"9e111f064109efb80595f389cf76bd33e94329a836ac93783b825bad86540fca"} Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.099589 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"20d56e7a74da74bc3e786f8d968386a34cd4b5ec0e49de641d423f790cb37956"} Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.099604 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"ea4312fd73cdc8a09be4910850558f29a363764d649f079cd7c1b4c481b670bc"} Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.099617 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"eea39f2936f64253dbca97abf1eae738f216e0ac725faeb0411a2436b291b18d"} Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.099630 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"612389f7-00cb-49cc-9daf-a5e451d6312f","Type":"ContainerStarted","Data":"d25a9011feb3076346ff6bd1fcb56b0b80c27707613db55f3066447fa8cc0932"} Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.104374 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x9p5v-config-z59q5" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.104371 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x9p5v-config-z59q5" event={"ID":"82d4d466-35c9-4fc9-ba1a-951193d2ccac","Type":"ContainerDied","Data":"ba2c18c7009527ff03a075fe403da8a9c6a0b224e18ffe101dc371b1f637c344"} Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.104512 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba2c18c7009527ff03a075fe403da8a9c6a0b224e18ffe101dc371b1f637c344" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.141882 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.396207571 podStartE2EDuration="45.141864605s" podCreationTimestamp="2026-03-10 15:25:11 +0000 UTC" firstStartedPulling="2026-03-10 15:25:44.83913404 +0000 UTC m=+1178.004874938" lastFinishedPulling="2026-03-10 15:25:54.584791034 +0000 UTC m=+1187.750531972" observedRunningTime="2026-03-10 15:25:56.136887793 +0000 UTC m=+1189.302628691" watchObservedRunningTime="2026-03-10 15:25:56.141864605 +0000 UTC m=+1189.307605503" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.455871 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-x9p5v-config-z59q5"] Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.463831 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-x9p5v-config-z59q5"] Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.465918 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.499295 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7wt5n"] Mar 10 15:25:56 crc kubenswrapper[4795]: E0310 15:25:56.499595 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d4d466-35c9-4fc9-ba1a-951193d2ccac" containerName="ovn-config" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.499606 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d4d466-35c9-4fc9-ba1a-951193d2ccac" containerName="ovn-config" Mar 10 15:25:56 crc kubenswrapper[4795]: E0310 15:25:56.499636 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994" containerName="glance-db-sync" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.499645 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994" containerName="glance-db-sync" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.499802 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d4d466-35c9-4fc9-ba1a-951193d2ccac" containerName="ovn-config" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.499819 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994" containerName="glance-db-sync" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.500747 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.504174 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.513223 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7wt5n"] Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.609100 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hcvp\" (UniqueName: \"kubernetes.io/projected/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-kube-api-access-9hcvp\") pod \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.609160 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-config-data\") pod \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.609233 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-db-sync-config-data\") pod \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.609274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-combined-ca-bundle\") pod \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\" (UID: \"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994\") " Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.609491 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.609598 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.609648 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfgjf\" (UniqueName: \"kubernetes.io/projected/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-kube-api-access-sfgjf\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.609693 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.609715 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-config\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.609750 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.614313 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994" (UID: "a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.615020 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-kube-api-access-9hcvp" (OuterVolumeSpecName: "kube-api-access-9hcvp") pod "a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994" (UID: "a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994"). InnerVolumeSpecName "kube-api-access-9hcvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.633009 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994" (UID: "a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.652247 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-config-data" (OuterVolumeSpecName: "config-data") pod "a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994" (UID: "a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.711964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.712013 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-config\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.712098 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.712130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.712251 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.712297 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfgjf\" (UniqueName: \"kubernetes.io/projected/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-kube-api-access-sfgjf\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.712382 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hcvp\" (UniqueName: \"kubernetes.io/projected/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-kube-api-access-9hcvp\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.712515 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.712554 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.712567 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.712958 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.713128 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-config\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.713671 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.713747 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.714458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.730370 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfgjf\" (UniqueName: 
\"kubernetes.io/projected/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-kube-api-access-sfgjf\") pod \"dnsmasq-dns-5c79d794d7-7wt5n\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:56 crc kubenswrapper[4795]: I0310 15:25:56.817707 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.114355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-hr6mw" event={"ID":"a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994","Type":"ContainerDied","Data":"014d3c3b812f0891893d304c207d897b9a97214dcd77136754e2a623c34fdb44"} Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.114382 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-hr6mw" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.114398 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="014d3c3b812f0891893d304c207d897b9a97214dcd77136754e2a623c34fdb44" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.337083 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7wt5n"] Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.361446 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.469489 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7wt5n"] Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.493408 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d4d466-35c9-4fc9-ba1a-951193d2ccac" path="/var/lib/kubelet/pods/82d4d466-35c9-4fc9-ba1a-951193d2ccac/volumes" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.529185 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-5f59b8f679-f2x9q"] Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.536451 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.578798 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-f2x9q"] Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.655001 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-config\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.655061 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.655094 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.655394 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgfz2\" (UniqueName: \"kubernetes.io/projected/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-kube-api-access-xgfz2\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.655501 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.655545 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.756856 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.756940 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.757994 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.758310 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.758365 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgfz2\" (UniqueName: \"kubernetes.io/projected/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-kube-api-access-xgfz2\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.758702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.758731 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.759320 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" 
Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.759480 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-config\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.759632 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.760117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-config\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.764824 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-zf2n4"] Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.766029 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zf2n4" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.787333 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zf2n4"] Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.813759 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgfz2\" (UniqueName: \"kubernetes.io/projected/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-kube-api-access-xgfz2\") pod \"dnsmasq-dns-5f59b8f679-f2x9q\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.862102 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvv9\" (UniqueName: \"kubernetes.io/projected/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-kube-api-access-wkvv9\") pod \"cinder-db-create-zf2n4\" (UID: \"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5\") " pod="openstack/cinder-db-create-zf2n4" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.862275 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-operator-scripts\") pod \"cinder-db-create-zf2n4\" (UID: \"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5\") " pod="openstack/cinder-db-create-zf2n4" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.879735 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.963384 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-operator-scripts\") pod \"cinder-db-create-zf2n4\" (UID: \"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5\") " pod="openstack/cinder-db-create-zf2n4" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.963516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvv9\" (UniqueName: \"kubernetes.io/projected/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-kube-api-access-wkvv9\") pod \"cinder-db-create-zf2n4\" (UID: \"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5\") " pod="openstack/cinder-db-create-zf2n4" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.964783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-operator-scripts\") pod \"cinder-db-create-zf2n4\" (UID: \"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5\") " pod="openstack/cinder-db-create-zf2n4" Mar 10 15:25:57 crc kubenswrapper[4795]: I0310 15:25:57.994431 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvv9\" (UniqueName: \"kubernetes.io/projected/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-kube-api-access-wkvv9\") pod \"cinder-db-create-zf2n4\" (UID: \"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5\") " pod="openstack/cinder-db-create-zf2n4" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.082408 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-zf2n4" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.115834 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-cwpm2"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.141036 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cwpm2"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.141541 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cwpm2" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.149760 4795 generic.go:334] "Generic (PLEG): container finished" podID="483dfbf5-4601-4e8f-85e4-4b5ea8bce89e" containerID="62f5e8e1b2bca189f26094c86fa16b17f23e204f0966414c89ea56a0c47c1c42" exitCode=0 Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.149799 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" event={"ID":"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e","Type":"ContainerDied","Data":"62f5e8e1b2bca189f26094c86fa16b17f23e204f0966414c89ea56a0c47c1c42"} Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.149822 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" event={"ID":"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e","Type":"ContainerStarted","Data":"04da18a9ae3ff20a0cb7070ed80d599cba66eb9b8778a1e34ee00dbdc7651233"} Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.166425 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf7f5\" (UniqueName: \"kubernetes.io/projected/325f8d75-2010-44ed-a01a-954670df7e15-kube-api-access-nf7f5\") pod \"barbican-db-create-cwpm2\" (UID: \"325f8d75-2010-44ed-a01a-954670df7e15\") " pod="openstack/barbican-db-create-cwpm2" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.166535 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/325f8d75-2010-44ed-a01a-954670df7e15-operator-scripts\") pod \"barbican-db-create-cwpm2\" (UID: \"325f8d75-2010-44ed-a01a-954670df7e15\") " pod="openstack/barbican-db-create-cwpm2" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.168792 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-664d-account-create-update-f29p6"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.169704 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-664d-account-create-update-f29p6" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.182699 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.252914 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-664d-account-create-update-f29p6"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.268089 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859fb054-ddc6-430b-b049-0571c3c57be3-operator-scripts\") pod \"cinder-664d-account-create-update-f29p6\" (UID: \"859fb054-ddc6-430b-b049-0571c3c57be3\") " pod="openstack/cinder-664d-account-create-update-f29p6" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.268142 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg7xt\" (UniqueName: \"kubernetes.io/projected/859fb054-ddc6-430b-b049-0571c3c57be3-kube-api-access-vg7xt\") pod \"cinder-664d-account-create-update-f29p6\" (UID: \"859fb054-ddc6-430b-b049-0571c3c57be3\") " pod="openstack/cinder-664d-account-create-update-f29p6" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.268182 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nf7f5\" (UniqueName: \"kubernetes.io/projected/325f8d75-2010-44ed-a01a-954670df7e15-kube-api-access-nf7f5\") pod \"barbican-db-create-cwpm2\" (UID: \"325f8d75-2010-44ed-a01a-954670df7e15\") " pod="openstack/barbican-db-create-cwpm2" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.268231 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/325f8d75-2010-44ed-a01a-954670df7e15-operator-scripts\") pod \"barbican-db-create-cwpm2\" (UID: \"325f8d75-2010-44ed-a01a-954670df7e15\") " pod="openstack/barbican-db-create-cwpm2" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.268968 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/325f8d75-2010-44ed-a01a-954670df7e15-operator-scripts\") pod \"barbican-db-create-cwpm2\" (UID: \"325f8d75-2010-44ed-a01a-954670df7e15\") " pod="openstack/barbican-db-create-cwpm2" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.321724 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-8lgbz"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.322820 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8lgbz" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.331802 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-ksntn"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.332851 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-ksntn" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.359615 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mn9kr" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.362199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf7f5\" (UniqueName: \"kubernetes.io/projected/325f8d75-2010-44ed-a01a-954670df7e15-kube-api-access-nf7f5\") pod \"barbican-db-create-cwpm2\" (UID: \"325f8d75-2010-44ed-a01a-954670df7e15\") " pod="openstack/barbican-db-create-cwpm2" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.365823 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.365981 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.370904 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-combined-ca-bundle\") pod \"keystone-db-sync-ksntn\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " pod="openstack/keystone-db-sync-ksntn" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.370958 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lcnj\" (UniqueName: \"kubernetes.io/projected/c217cb85-4bff-4cc2-a554-8dd436e093b0-kube-api-access-6lcnj\") pod \"neutron-db-create-8lgbz\" (UID: \"c217cb85-4bff-4cc2-a554-8dd436e093b0\") " pod="openstack/neutron-db-create-8lgbz" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.371058 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-config-data\") pod \"keystone-db-sync-ksntn\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " pod="openstack/keystone-db-sync-ksntn" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.371126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859fb054-ddc6-430b-b049-0571c3c57be3-operator-scripts\") pod \"cinder-664d-account-create-update-f29p6\" (UID: \"859fb054-ddc6-430b-b049-0571c3c57be3\") " pod="openstack/cinder-664d-account-create-update-f29p6" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.371150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg7xt\" (UniqueName: \"kubernetes.io/projected/859fb054-ddc6-430b-b049-0571c3c57be3-kube-api-access-vg7xt\") pod \"cinder-664d-account-create-update-f29p6\" (UID: \"859fb054-ddc6-430b-b049-0571c3c57be3\") " pod="openstack/cinder-664d-account-create-update-f29p6" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.371236 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-558vz\" (UniqueName: \"kubernetes.io/projected/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-kube-api-access-558vz\") pod \"keystone-db-sync-ksntn\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " pod="openstack/keystone-db-sync-ksntn" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.371257 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c217cb85-4bff-4cc2-a554-8dd436e093b0-operator-scripts\") pod \"neutron-db-create-8lgbz\" (UID: \"c217cb85-4bff-4cc2-a554-8dd436e093b0\") " pod="openstack/neutron-db-create-8lgbz" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.371881 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859fb054-ddc6-430b-b049-0571c3c57be3-operator-scripts\") pod \"cinder-664d-account-create-update-f29p6\" (UID: \"859fb054-ddc6-430b-b049-0571c3c57be3\") " pod="openstack/cinder-664d-account-create-update-f29p6" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.371894 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8lgbz"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.377311 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.400210 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ksntn"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.407340 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg7xt\" (UniqueName: \"kubernetes.io/projected/859fb054-ddc6-430b-b049-0571c3c57be3-kube-api-access-vg7xt\") pod \"cinder-664d-account-create-update-f29p6\" (UID: \"859fb054-ddc6-430b-b049-0571c3c57be3\") " pod="openstack/cinder-664d-account-create-update-f29p6" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.441697 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-d52c-account-create-update-d6hmx"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.442972 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d52c-account-create-update-d6hmx" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.448815 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.469294 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-cwpm2" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.473111 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-combined-ca-bundle\") pod \"keystone-db-sync-ksntn\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " pod="openstack/keystone-db-sync-ksntn" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.473159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lcnj\" (UniqueName: \"kubernetes.io/projected/c217cb85-4bff-4cc2-a554-8dd436e093b0-kube-api-access-6lcnj\") pod \"neutron-db-create-8lgbz\" (UID: \"c217cb85-4bff-4cc2-a554-8dd436e093b0\") " pod="openstack/neutron-db-create-8lgbz" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.473209 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-config-data\") pod \"keystone-db-sync-ksntn\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " pod="openstack/keystone-db-sync-ksntn" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.473258 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d2d3011-1ce1-43c4-b058-e2171446b079-operator-scripts\") pod \"barbican-d52c-account-create-update-d6hmx\" (UID: \"9d2d3011-1ce1-43c4-b058-e2171446b079\") " pod="openstack/barbican-d52c-account-create-update-d6hmx" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.473291 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blkwx\" (UniqueName: \"kubernetes.io/projected/9d2d3011-1ce1-43c4-b058-e2171446b079-kube-api-access-blkwx\") pod \"barbican-d52c-account-create-update-d6hmx\" (UID: 
\"9d2d3011-1ce1-43c4-b058-e2171446b079\") " pod="openstack/barbican-d52c-account-create-update-d6hmx" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.473319 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-558vz\" (UniqueName: \"kubernetes.io/projected/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-kube-api-access-558vz\") pod \"keystone-db-sync-ksntn\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " pod="openstack/keystone-db-sync-ksntn" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.473342 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c217cb85-4bff-4cc2-a554-8dd436e093b0-operator-scripts\") pod \"neutron-db-create-8lgbz\" (UID: \"c217cb85-4bff-4cc2-a554-8dd436e093b0\") " pod="openstack/neutron-db-create-8lgbz" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.474272 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c217cb85-4bff-4cc2-a554-8dd436e093b0-operator-scripts\") pod \"neutron-db-create-8lgbz\" (UID: \"c217cb85-4bff-4cc2-a554-8dd436e093b0\") " pod="openstack/neutron-db-create-8lgbz" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.487241 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-combined-ca-bundle\") pod \"keystone-db-sync-ksntn\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " pod="openstack/keystone-db-sync-ksntn" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.488485 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-664d-account-create-update-f29p6" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.491661 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-config-data\") pod \"keystone-db-sync-ksntn\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " pod="openstack/keystone-db-sync-ksntn" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.492304 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d52c-account-create-update-d6hmx"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.536819 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lcnj\" (UniqueName: \"kubernetes.io/projected/c217cb85-4bff-4cc2-a554-8dd436e093b0-kube-api-access-6lcnj\") pod \"neutron-db-create-8lgbz\" (UID: \"c217cb85-4bff-4cc2-a554-8dd436e093b0\") " pod="openstack/neutron-db-create-8lgbz" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.539151 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-73be-account-create-update-pwvb8"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.544150 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-73be-account-create-update-pwvb8" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.548732 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.593026 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-558vz\" (UniqueName: \"kubernetes.io/projected/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-kube-api-access-558vz\") pod \"keystone-db-sync-ksntn\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " pod="openstack/keystone-db-sync-ksntn" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.595397 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d2d3011-1ce1-43c4-b058-e2171446b079-operator-scripts\") pod \"barbican-d52c-account-create-update-d6hmx\" (UID: \"9d2d3011-1ce1-43c4-b058-e2171446b079\") " pod="openstack/barbican-d52c-account-create-update-d6hmx" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.595616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blkwx\" (UniqueName: \"kubernetes.io/projected/9d2d3011-1ce1-43c4-b058-e2171446b079-kube-api-access-blkwx\") pod \"barbican-d52c-account-create-update-d6hmx\" (UID: \"9d2d3011-1ce1-43c4-b058-e2171446b079\") " pod="openstack/barbican-d52c-account-create-update-d6hmx" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.596215 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-73be-account-create-update-pwvb8"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.596562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d2d3011-1ce1-43c4-b058-e2171446b079-operator-scripts\") pod \"barbican-d52c-account-create-update-d6hmx\" (UID: 
\"9d2d3011-1ce1-43c4-b058-e2171446b079\") " pod="openstack/barbican-d52c-account-create-update-d6hmx" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.620863 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blkwx\" (UniqueName: \"kubernetes.io/projected/9d2d3011-1ce1-43c4-b058-e2171446b079-kube-api-access-blkwx\") pod \"barbican-d52c-account-create-update-d6hmx\" (UID: \"9d2d3011-1ce1-43c4-b058-e2171446b079\") " pod="openstack/barbican-d52c-account-create-update-d6hmx" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.661556 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8lgbz" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.671780 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ksntn" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.685921 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-f2x9q"] Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.697574 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8jmr\" (UniqueName: \"kubernetes.io/projected/450e20be-0c76-4062-8e10-4a11808a9cca-kube-api-access-v8jmr\") pod \"neutron-73be-account-create-update-pwvb8\" (UID: \"450e20be-0c76-4062-8e10-4a11808a9cca\") " pod="openstack/neutron-73be-account-create-update-pwvb8" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.697661 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/450e20be-0c76-4062-8e10-4a11808a9cca-operator-scripts\") pod \"neutron-73be-account-create-update-pwvb8\" (UID: \"450e20be-0c76-4062-8e10-4a11808a9cca\") " pod="openstack/neutron-73be-account-create-update-pwvb8" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.766725 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d52c-account-create-update-d6hmx" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.799380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/450e20be-0c76-4062-8e10-4a11808a9cca-operator-scripts\") pod \"neutron-73be-account-create-update-pwvb8\" (UID: \"450e20be-0c76-4062-8e10-4a11808a9cca\") " pod="openstack/neutron-73be-account-create-update-pwvb8" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.800360 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/450e20be-0c76-4062-8e10-4a11808a9cca-operator-scripts\") pod \"neutron-73be-account-create-update-pwvb8\" (UID: \"450e20be-0c76-4062-8e10-4a11808a9cca\") " pod="openstack/neutron-73be-account-create-update-pwvb8" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.801074 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8jmr\" (UniqueName: \"kubernetes.io/projected/450e20be-0c76-4062-8e10-4a11808a9cca-kube-api-access-v8jmr\") pod \"neutron-73be-account-create-update-pwvb8\" (UID: \"450e20be-0c76-4062-8e10-4a11808a9cca\") " pod="openstack/neutron-73be-account-create-update-pwvb8" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.821274 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8jmr\" (UniqueName: \"kubernetes.io/projected/450e20be-0c76-4062-8e10-4a11808a9cca-kube-api-access-v8jmr\") pod \"neutron-73be-account-create-update-pwvb8\" (UID: \"450e20be-0c76-4062-8e10-4a11808a9cca\") " pod="openstack/neutron-73be-account-create-update-pwvb8" Mar 10 15:25:58 crc kubenswrapper[4795]: I0310 15:25:58.891605 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:59 crc kubenswrapper[4795]: W0310 15:25:59.010627 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c2d650a_a19d_4c82_a3fe_34850d8dbbc5.slice/crio-d2d29750c933441512cfab67649a0bd094cd626c936872a20c6ecaa860dd4889 WatchSource:0}: Error finding container d2d29750c933441512cfab67649a0bd094cd626c936872a20c6ecaa860dd4889: Status 404 returned error can't find the container with id d2d29750c933441512cfab67649a0bd094cd626c936872a20c6ecaa860dd4889 Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.014415 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-nb\") pod \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.014502 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-svc\") pod \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.014609 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-swift-storage-0\") pod \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.014646 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-config\") pod \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\" (UID: 
\"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.014667 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfgjf\" (UniqueName: \"kubernetes.io/projected/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-kube-api-access-sfgjf\") pod \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.014718 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-sb\") pod \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\" (UID: \"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e\") " Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.018116 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-zf2n4"] Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.037739 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-kube-api-access-sfgjf" (OuterVolumeSpecName: "kube-api-access-sfgjf") pod "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e" (UID: "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e"). InnerVolumeSpecName "kube-api-access-sfgjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.048120 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-config" (OuterVolumeSpecName: "config") pod "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e" (UID: "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.048219 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e" (UID: "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.051014 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e" (UID: "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.052376 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-73be-account-create-update-pwvb8" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.053691 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e" (UID: "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.058035 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e" (UID: "483dfbf5-4601-4e8f-85e4-4b5ea8bce89e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.117474 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.117975 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.117987 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.117998 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.118010 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfgjf\" (UniqueName: \"kubernetes.io/projected/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-kube-api-access-sfgjf\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.118021 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.164165 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" event={"ID":"483dfbf5-4601-4e8f-85e4-4b5ea8bce89e","Type":"ContainerDied","Data":"04da18a9ae3ff20a0cb7070ed80d599cba66eb9b8778a1e34ee00dbdc7651233"} Mar 10 15:25:59 crc 
kubenswrapper[4795]: I0310 15:25:59.164217 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-7wt5n" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.164229 4795 scope.go:117] "RemoveContainer" containerID="62f5e8e1b2bca189f26094c86fa16b17f23e204f0966414c89ea56a0c47c1c42" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.166935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zf2n4" event={"ID":"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5","Type":"ContainerStarted","Data":"d2d29750c933441512cfab67649a0bd094cd626c936872a20c6ecaa860dd4889"} Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.170221 4795 generic.go:334] "Generic (PLEG): container finished" podID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" containerID="5cf2a7c54e1b8c72b321e93336644b02065ab9ccdd40b337a73d7903681233c3" exitCode=0 Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.170266 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" event={"ID":"f1e1d893-1c72-4cea-aa9a-614fc05bd08c","Type":"ContainerDied","Data":"5cf2a7c54e1b8c72b321e93336644b02065ab9ccdd40b337a73d7903681233c3"} Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.170294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" event={"ID":"f1e1d893-1c72-4cea-aa9a-614fc05bd08c","Type":"ContainerStarted","Data":"fc0689864f2b1543d7bcfb93fe4a7903f0ae3e9fb7d0d2e8a576020ddcadbcc8"} Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.265142 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7wt5n"] Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.288215 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-7wt5n"] Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.300711 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-664d-account-create-update-f29p6"] Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.317534 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-cwpm2"] Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.325869 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-d52c-account-create-update-d6hmx"] Mar 10 15:25:59 crc kubenswrapper[4795]: W0310 15:25:59.336736 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d2d3011_1ce1_43c4_b058_e2171446b079.slice/crio-39e113b29ae96afb58c3444f97dcd644b7cbe5de4e73c5fe13e15f474d53f4ab WatchSource:0}: Error finding container 39e113b29ae96afb58c3444f97dcd644b7cbe5de4e73c5fe13e15f474d53f4ab: Status 404 returned error can't find the container with id 39e113b29ae96afb58c3444f97dcd644b7cbe5de4e73c5fe13e15f474d53f4ab Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.374152 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8lgbz"] Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.421545 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-ksntn"] Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.489100 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="483dfbf5-4601-4e8f-85e4-4b5ea8bce89e" path="/var/lib/kubelet/pods/483dfbf5-4601-4e8f-85e4-4b5ea8bce89e/volumes" Mar 10 15:25:59 crc kubenswrapper[4795]: I0310 15:25:59.500169 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-73be-account-create-update-pwvb8"] Mar 10 15:25:59 crc kubenswrapper[4795]: W0310 15:25:59.528445 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod450e20be_0c76_4062_8e10_4a11808a9cca.slice/crio-89e24b00a1aca05095c3ab152fe15d278c4e3866ea3affdb8ad28e3ed872edb9 
WatchSource:0}: Error finding container 89e24b00a1aca05095c3ab152fe15d278c4e3866ea3affdb8ad28e3ed872edb9: Status 404 returned error can't find the container with id 89e24b00a1aca05095c3ab152fe15d278c4e3866ea3affdb8ad28e3ed872edb9 Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.128785 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552606-pxjj6"] Mar 10 15:26:00 crc kubenswrapper[4795]: E0310 15:26:00.129522 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483dfbf5-4601-4e8f-85e4-4b5ea8bce89e" containerName="init" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.129537 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="483dfbf5-4601-4e8f-85e4-4b5ea8bce89e" containerName="init" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.129703 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="483dfbf5-4601-4e8f-85e4-4b5ea8bce89e" containerName="init" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.133178 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552606-pxjj6" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.135525 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.136111 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.136406 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.139971 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552606-pxjj6"] Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.179338 4795 generic.go:334] "Generic (PLEG): container finished" podID="c217cb85-4bff-4cc2-a554-8dd436e093b0" containerID="08b9db2fb21ce2d930ad792b63000711fa2bb8a01f209b7ac7a8b780cb053953" exitCode=0 Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.179372 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8lgbz" event={"ID":"c217cb85-4bff-4cc2-a554-8dd436e093b0","Type":"ContainerDied","Data":"08b9db2fb21ce2d930ad792b63000711fa2bb8a01f209b7ac7a8b780cb053953"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.179413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8lgbz" event={"ID":"c217cb85-4bff-4cc2-a554-8dd436e093b0","Type":"ContainerStarted","Data":"0bb0bbd0a37998831685cc403c0d9689ef96defcab842b99448eaf494bc8777d"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.183103 4795 generic.go:334] "Generic (PLEG): container finished" podID="9d2d3011-1ce1-43c4-b058-e2171446b079" containerID="5ea8414e86e0e61cabdd23bdc8e027670f375ece690a50e7040aa7e0d6e6ab41" exitCode=0 Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.183165 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-d52c-account-create-update-d6hmx" event={"ID":"9d2d3011-1ce1-43c4-b058-e2171446b079","Type":"ContainerDied","Data":"5ea8414e86e0e61cabdd23bdc8e027670f375ece690a50e7040aa7e0d6e6ab41"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.183198 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d52c-account-create-update-d6hmx" event={"ID":"9d2d3011-1ce1-43c4-b058-e2171446b079","Type":"ContainerStarted","Data":"39e113b29ae96afb58c3444f97dcd644b7cbe5de4e73c5fe13e15f474d53f4ab"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.188501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" event={"ID":"f1e1d893-1c72-4cea-aa9a-614fc05bd08c","Type":"ContainerStarted","Data":"a12932f70a4f14d304c8d96c2d950d003db286688eb85d5c5b8e8e0c297fb459"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.189307 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.191157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ksntn" event={"ID":"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5","Type":"ContainerStarted","Data":"109d05df610ce9bd83bd061f81ee2d70ad424a11b9960fbccb72700e03be5e9f"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.193213 4795 generic.go:334] "Generic (PLEG): container finished" podID="325f8d75-2010-44ed-a01a-954670df7e15" containerID="f7460ed6807bd1db5c1d96184e08d131859e20f5d1c897feef35c304b1182395" exitCode=0 Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.193270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cwpm2" event={"ID":"325f8d75-2010-44ed-a01a-954670df7e15","Type":"ContainerDied","Data":"f7460ed6807bd1db5c1d96184e08d131859e20f5d1c897feef35c304b1182395"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.193291 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cwpm2" event={"ID":"325f8d75-2010-44ed-a01a-954670df7e15","Type":"ContainerStarted","Data":"24b3d5c43babf537161a3c5f3665e2e9385a89fabb7e2475b4c33af3c6181461"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.201178 4795 generic.go:334] "Generic (PLEG): container finished" podID="859fb054-ddc6-430b-b049-0571c3c57be3" containerID="5672b8da337c8b685d6738d823f5cc15264092331dea35e99cd89ddfd277abcf" exitCode=0 Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.201255 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-664d-account-create-update-f29p6" event={"ID":"859fb054-ddc6-430b-b049-0571c3c57be3","Type":"ContainerDied","Data":"5672b8da337c8b685d6738d823f5cc15264092331dea35e99cd89ddfd277abcf"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.201286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-664d-account-create-update-f29p6" event={"ID":"859fb054-ddc6-430b-b049-0571c3c57be3","Type":"ContainerStarted","Data":"fae50489ac007587f0b095a158050cd2519046f8a3325df3b5cdd021e8ec49a1"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.204945 4795 generic.go:334] "Generic (PLEG): container finished" podID="450e20be-0c76-4062-8e10-4a11808a9cca" containerID="c4fc693cc16df99231070db18f9638dcba4cd78e806c3601418f73c2e0dbed82" exitCode=0 Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.205114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73be-account-create-update-pwvb8" event={"ID":"450e20be-0c76-4062-8e10-4a11808a9cca","Type":"ContainerDied","Data":"c4fc693cc16df99231070db18f9638dcba4cd78e806c3601418f73c2e0dbed82"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.205159 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73be-account-create-update-pwvb8" 
event={"ID":"450e20be-0c76-4062-8e10-4a11808a9cca","Type":"ContainerStarted","Data":"89e24b00a1aca05095c3ab152fe15d278c4e3866ea3affdb8ad28e3ed872edb9"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.206853 4795 generic.go:334] "Generic (PLEG): container finished" podID="6c2d650a-a19d-4c82-a3fe-34850d8dbbc5" containerID="37f7be6671ae9a44c0e7b4d21139278390629316894e34a28da6559207ece2e3" exitCode=0 Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.206907 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zf2n4" event={"ID":"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5","Type":"ContainerDied","Data":"37f7be6671ae9a44c0e7b4d21139278390629316894e34a28da6559207ece2e3"} Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.222892 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" podStartSLOduration=3.222875142 podStartE2EDuration="3.222875142s" podCreationTimestamp="2026-03-10 15:25:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:00.217979522 +0000 UTC m=+1193.383720420" watchObservedRunningTime="2026-03-10 15:26:00.222875142 +0000 UTC m=+1193.388616040" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.234105 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qbvm\" (UniqueName: \"kubernetes.io/projected/f110d7f5-06c8-4211-b21e-7a52e6b694b7-kube-api-access-2qbvm\") pod \"auto-csr-approver-29552606-pxjj6\" (UID: \"f110d7f5-06c8-4211-b21e-7a52e6b694b7\") " pod="openshift-infra/auto-csr-approver-29552606-pxjj6" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.335847 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qbvm\" (UniqueName: \"kubernetes.io/projected/f110d7f5-06c8-4211-b21e-7a52e6b694b7-kube-api-access-2qbvm\") pod 
\"auto-csr-approver-29552606-pxjj6\" (UID: \"f110d7f5-06c8-4211-b21e-7a52e6b694b7\") " pod="openshift-infra/auto-csr-approver-29552606-pxjj6" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.356368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qbvm\" (UniqueName: \"kubernetes.io/projected/f110d7f5-06c8-4211-b21e-7a52e6b694b7-kube-api-access-2qbvm\") pod \"auto-csr-approver-29552606-pxjj6\" (UID: \"f110d7f5-06c8-4211-b21e-7a52e6b694b7\") " pod="openshift-infra/auto-csr-approver-29552606-pxjj6" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.452413 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552606-pxjj6" Mar 10 15:26:00 crc kubenswrapper[4795]: I0310 15:26:00.932811 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552606-pxjj6"] Mar 10 15:26:00 crc kubenswrapper[4795]: W0310 15:26:00.959576 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf110d7f5_06c8_4211_b21e_7a52e6b694b7.slice/crio-c35a456c167f2ec9d94f7e807c1f0473cc8194dd3e7a54695671f8f9b3b7f82d WatchSource:0}: Error finding container c35a456c167f2ec9d94f7e807c1f0473cc8194dd3e7a54695671f8f9b3b7f82d: Status 404 returned error can't find the container with id c35a456c167f2ec9d94f7e807c1f0473cc8194dd3e7a54695671f8f9b3b7f82d Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.216008 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552606-pxjj6" event={"ID":"f110d7f5-06c8-4211-b21e-7a52e6b694b7","Type":"ContainerStarted","Data":"c35a456c167f2ec9d94f7e807c1f0473cc8194dd3e7a54695671f8f9b3b7f82d"} Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.602350 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-73be-account-create-update-pwvb8" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.762869 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/450e20be-0c76-4062-8e10-4a11808a9cca-operator-scripts\") pod \"450e20be-0c76-4062-8e10-4a11808a9cca\" (UID: \"450e20be-0c76-4062-8e10-4a11808a9cca\") " Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.762981 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8jmr\" (UniqueName: \"kubernetes.io/projected/450e20be-0c76-4062-8e10-4a11808a9cca-kube-api-access-v8jmr\") pod \"450e20be-0c76-4062-8e10-4a11808a9cca\" (UID: \"450e20be-0c76-4062-8e10-4a11808a9cca\") " Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.764151 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/450e20be-0c76-4062-8e10-4a11808a9cca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "450e20be-0c76-4062-8e10-4a11808a9cca" (UID: "450e20be-0c76-4062-8e10-4a11808a9cca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.775769 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/450e20be-0c76-4062-8e10-4a11808a9cca-kube-api-access-v8jmr" (OuterVolumeSpecName: "kube-api-access-v8jmr") pod "450e20be-0c76-4062-8e10-4a11808a9cca" (UID: "450e20be-0c76-4062-8e10-4a11808a9cca"). InnerVolumeSpecName "kube-api-access-v8jmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.789409 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8lgbz" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.803912 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-664d-account-create-update-f29p6" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.809667 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-d52c-account-create-update-d6hmx" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.815035 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zf2n4" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.828058 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cwpm2" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.864652 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/450e20be-0c76-4062-8e10-4a11808a9cca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.864688 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8jmr\" (UniqueName: \"kubernetes.io/projected/450e20be-0c76-4062-8e10-4a11808a9cca-kube-api-access-v8jmr\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.965864 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lcnj\" (UniqueName: \"kubernetes.io/projected/c217cb85-4bff-4cc2-a554-8dd436e093b0-kube-api-access-6lcnj\") pod \"c217cb85-4bff-4cc2-a554-8dd436e093b0\" (UID: \"c217cb85-4bff-4cc2-a554-8dd436e093b0\") " Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.965944 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9d2d3011-1ce1-43c4-b058-e2171446b079-operator-scripts\") pod \"9d2d3011-1ce1-43c4-b058-e2171446b079\" (UID: \"9d2d3011-1ce1-43c4-b058-e2171446b079\") " Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.965962 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkvv9\" (UniqueName: \"kubernetes.io/projected/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-kube-api-access-wkvv9\") pod \"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5\" (UID: \"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5\") " Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.965982 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blkwx\" (UniqueName: \"kubernetes.io/projected/9d2d3011-1ce1-43c4-b058-e2171446b079-kube-api-access-blkwx\") pod \"9d2d3011-1ce1-43c4-b058-e2171446b079\" (UID: \"9d2d3011-1ce1-43c4-b058-e2171446b079\") " Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.966024 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859fb054-ddc6-430b-b049-0571c3c57be3-operator-scripts\") pod \"859fb054-ddc6-430b-b049-0571c3c57be3\" (UID: \"859fb054-ddc6-430b-b049-0571c3c57be3\") " Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.966118 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg7xt\" (UniqueName: \"kubernetes.io/projected/859fb054-ddc6-430b-b049-0571c3c57be3-kube-api-access-vg7xt\") pod \"859fb054-ddc6-430b-b049-0571c3c57be3\" (UID: \"859fb054-ddc6-430b-b049-0571c3c57be3\") " Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.966140 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/325f8d75-2010-44ed-a01a-954670df7e15-operator-scripts\") pod \"325f8d75-2010-44ed-a01a-954670df7e15\" (UID: \"325f8d75-2010-44ed-a01a-954670df7e15\") " 
Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.966161 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf7f5\" (UniqueName: \"kubernetes.io/projected/325f8d75-2010-44ed-a01a-954670df7e15-kube-api-access-nf7f5\") pod \"325f8d75-2010-44ed-a01a-954670df7e15\" (UID: \"325f8d75-2010-44ed-a01a-954670df7e15\") " Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.966216 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-operator-scripts\") pod \"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5\" (UID: \"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5\") " Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.966248 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c217cb85-4bff-4cc2-a554-8dd436e093b0-operator-scripts\") pod \"c217cb85-4bff-4cc2-a554-8dd436e093b0\" (UID: \"c217cb85-4bff-4cc2-a554-8dd436e093b0\") " Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.966620 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d2d3011-1ce1-43c4-b058-e2171446b079-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d2d3011-1ce1-43c4-b058-e2171446b079" (UID: "9d2d3011-1ce1-43c4-b058-e2171446b079"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.966985 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c217cb85-4bff-4cc2-a554-8dd436e093b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c217cb85-4bff-4cc2-a554-8dd436e093b0" (UID: "c217cb85-4bff-4cc2-a554-8dd436e093b0"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.967547 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/325f8d75-2010-44ed-a01a-954670df7e15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "325f8d75-2010-44ed-a01a-954670df7e15" (UID: "325f8d75-2010-44ed-a01a-954670df7e15"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.967589 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/859fb054-ddc6-430b-b049-0571c3c57be3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "859fb054-ddc6-430b-b049-0571c3c57be3" (UID: "859fb054-ddc6-430b-b049-0571c3c57be3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.967610 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6c2d650a-a19d-4c82-a3fe-34850d8dbbc5" (UID: "6c2d650a-a19d-4c82-a3fe-34850d8dbbc5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.969919 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c217cb85-4bff-4cc2-a554-8dd436e093b0-kube-api-access-6lcnj" (OuterVolumeSpecName: "kube-api-access-6lcnj") pod "c217cb85-4bff-4cc2-a554-8dd436e093b0" (UID: "c217cb85-4bff-4cc2-a554-8dd436e093b0"). InnerVolumeSpecName "kube-api-access-6lcnj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.970370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/325f8d75-2010-44ed-a01a-954670df7e15-kube-api-access-nf7f5" (OuterVolumeSpecName: "kube-api-access-nf7f5") pod "325f8d75-2010-44ed-a01a-954670df7e15" (UID: "325f8d75-2010-44ed-a01a-954670df7e15"). InnerVolumeSpecName "kube-api-access-nf7f5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.970687 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d2d3011-1ce1-43c4-b058-e2171446b079-kube-api-access-blkwx" (OuterVolumeSpecName: "kube-api-access-blkwx") pod "9d2d3011-1ce1-43c4-b058-e2171446b079" (UID: "9d2d3011-1ce1-43c4-b058-e2171446b079"). InnerVolumeSpecName "kube-api-access-blkwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.972725 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/859fb054-ddc6-430b-b049-0571c3c57be3-kube-api-access-vg7xt" (OuterVolumeSpecName: "kube-api-access-vg7xt") pod "859fb054-ddc6-430b-b049-0571c3c57be3" (UID: "859fb054-ddc6-430b-b049-0571c3c57be3"). InnerVolumeSpecName "kube-api-access-vg7xt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:01 crc kubenswrapper[4795]: I0310 15:26:01.974207 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-kube-api-access-wkvv9" (OuterVolumeSpecName: "kube-api-access-wkvv9") pod "6c2d650a-a19d-4c82-a3fe-34850d8dbbc5" (UID: "6c2d650a-a19d-4c82-a3fe-34850d8dbbc5"). InnerVolumeSpecName "kube-api-access-wkvv9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.068057 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/859fb054-ddc6-430b-b049-0571c3c57be3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.068404 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg7xt\" (UniqueName: \"kubernetes.io/projected/859fb054-ddc6-430b-b049-0571c3c57be3-kube-api-access-vg7xt\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.068419 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/325f8d75-2010-44ed-a01a-954670df7e15-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.068429 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf7f5\" (UniqueName: \"kubernetes.io/projected/325f8d75-2010-44ed-a01a-954670df7e15-kube-api-access-nf7f5\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.068439 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.068447 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c217cb85-4bff-4cc2-a554-8dd436e093b0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.068456 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lcnj\" (UniqueName: \"kubernetes.io/projected/c217cb85-4bff-4cc2-a554-8dd436e093b0-kube-api-access-6lcnj\") on node \"crc\" DevicePath \"\"" Mar 10 
15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.068464 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkvv9\" (UniqueName: \"kubernetes.io/projected/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5-kube-api-access-wkvv9\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.068473 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d2d3011-1ce1-43c4-b058-e2171446b079-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.068481 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blkwx\" (UniqueName: \"kubernetes.io/projected/9d2d3011-1ce1-43c4-b058-e2171446b079-kube-api-access-blkwx\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.226579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-73be-account-create-update-pwvb8" event={"ID":"450e20be-0c76-4062-8e10-4a11808a9cca","Type":"ContainerDied","Data":"89e24b00a1aca05095c3ab152fe15d278c4e3866ea3affdb8ad28e3ed872edb9"} Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.226645 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89e24b00a1aca05095c3ab152fe15d278c4e3866ea3affdb8ad28e3ed872edb9" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.226600 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-73be-account-create-update-pwvb8" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.228406 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-zf2n4" event={"ID":"6c2d650a-a19d-4c82-a3fe-34850d8dbbc5","Type":"ContainerDied","Data":"d2d29750c933441512cfab67649a0bd094cd626c936872a20c6ecaa860dd4889"} Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.228454 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2d29750c933441512cfab67649a0bd094cd626c936872a20c6ecaa860dd4889" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.228451 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-zf2n4" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.231407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-cwpm2" event={"ID":"325f8d75-2010-44ed-a01a-954670df7e15","Type":"ContainerDied","Data":"24b3d5c43babf537161a3c5f3665e2e9385a89fabb7e2475b4c33af3c6181461"} Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.231449 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b3d5c43babf537161a3c5f3665e2e9385a89fabb7e2475b4c33af3c6181461" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.231506 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-cwpm2" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.235652 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-664d-account-create-update-f29p6" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.235649 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-664d-account-create-update-f29p6" event={"ID":"859fb054-ddc6-430b-b049-0571c3c57be3","Type":"ContainerDied","Data":"fae50489ac007587f0b095a158050cd2519046f8a3325df3b5cdd021e8ec49a1"} Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.235771 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fae50489ac007587f0b095a158050cd2519046f8a3325df3b5cdd021e8ec49a1" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.248019 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8lgbz" event={"ID":"c217cb85-4bff-4cc2-a554-8dd436e093b0","Type":"ContainerDied","Data":"0bb0bbd0a37998831685cc403c0d9689ef96defcab842b99448eaf494bc8777d"} Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.248113 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bb0bbd0a37998831685cc403c0d9689ef96defcab842b99448eaf494bc8777d" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.248218 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8lgbz" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.254190 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-d52c-account-create-update-d6hmx" event={"ID":"9d2d3011-1ce1-43c4-b058-e2171446b079","Type":"ContainerDied","Data":"39e113b29ae96afb58c3444f97dcd644b7cbe5de4e73c5fe13e15f474d53f4ab"} Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.254234 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39e113b29ae96afb58c3444f97dcd644b7cbe5de4e73c5fe13e15f474d53f4ab" Mar 10 15:26:02 crc kubenswrapper[4795]: I0310 15:26:02.254965 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-d52c-account-create-update-d6hmx" Mar 10 15:26:06 crc kubenswrapper[4795]: I0310 15:26:06.288475 4795 generic.go:334] "Generic (PLEG): container finished" podID="f110d7f5-06c8-4211-b21e-7a52e6b694b7" containerID="dcdd2a66ae19a75a61ddb0c935ce0e52157cfcda7bd2a89358b39260b0cc667d" exitCode=0 Mar 10 15:26:06 crc kubenswrapper[4795]: I0310 15:26:06.288557 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552606-pxjj6" event={"ID":"f110d7f5-06c8-4211-b21e-7a52e6b694b7","Type":"ContainerDied","Data":"dcdd2a66ae19a75a61ddb0c935ce0e52157cfcda7bd2a89358b39260b0cc667d"} Mar 10 15:26:06 crc kubenswrapper[4795]: I0310 15:26:06.292509 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ksntn" event={"ID":"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5","Type":"ContainerStarted","Data":"94143bbcaa5dbca085364e324ff835b43fdff929594d46a37cee4aa3fa2dce9d"} Mar 10 15:26:06 crc kubenswrapper[4795]: I0310 15:26:06.329364 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-ksntn" podStartSLOduration=2.080862232 podStartE2EDuration="8.329338622s" podCreationTimestamp="2026-03-10 15:25:58 +0000 UTC" firstStartedPulling="2026-03-10 15:25:59.40334972 +0000 UTC m=+1192.569090618" lastFinishedPulling="2026-03-10 15:26:05.65182611 +0000 UTC m=+1198.817567008" observedRunningTime="2026-03-10 15:26:06.317490363 +0000 UTC m=+1199.483231271" watchObservedRunningTime="2026-03-10 15:26:06.329338622 +0000 UTC m=+1199.495079560" Mar 10 15:26:07 crc kubenswrapper[4795]: I0310 15:26:07.083908 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:26:07 crc kubenswrapper[4795]: I0310 15:26:07.609505 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552606-pxjj6" Mar 10 15:26:07 crc kubenswrapper[4795]: I0310 15:26:07.769457 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qbvm\" (UniqueName: \"kubernetes.io/projected/f110d7f5-06c8-4211-b21e-7a52e6b694b7-kube-api-access-2qbvm\") pod \"f110d7f5-06c8-4211-b21e-7a52e6b694b7\" (UID: \"f110d7f5-06c8-4211-b21e-7a52e6b694b7\") " Mar 10 15:26:07 crc kubenswrapper[4795]: I0310 15:26:07.774709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f110d7f5-06c8-4211-b21e-7a52e6b694b7-kube-api-access-2qbvm" (OuterVolumeSpecName: "kube-api-access-2qbvm") pod "f110d7f5-06c8-4211-b21e-7a52e6b694b7" (UID: "f110d7f5-06c8-4211-b21e-7a52e6b694b7"). InnerVolumeSpecName "kube-api-access-2qbvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:07 crc kubenswrapper[4795]: I0310 15:26:07.871461 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qbvm\" (UniqueName: \"kubernetes.io/projected/f110d7f5-06c8-4211-b21e-7a52e6b694b7-kube-api-access-2qbvm\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:07 crc kubenswrapper[4795]: I0310 15:26:07.881267 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:26:07 crc kubenswrapper[4795]: I0310 15:26:07.966468 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-p4wdl"] Mar 10 15:26:07 crc kubenswrapper[4795]: I0310 15:26:07.967961 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" podUID="e2b540d3-b27e-4bf6-922c-89fef3c66955" containerName="dnsmasq-dns" containerID="cri-o://c53b8d09e84e7e4cdb96ebd3e0ade02887c0bce0356a6a5e016382d13f4fdfe5" gracePeriod=10 Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.368682 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29552606-pxjj6" event={"ID":"f110d7f5-06c8-4211-b21e-7a52e6b694b7","Type":"ContainerDied","Data":"c35a456c167f2ec9d94f7e807c1f0473cc8194dd3e7a54695671f8f9b3b7f82d"} Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.368726 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c35a456c167f2ec9d94f7e807c1f0473cc8194dd3e7a54695671f8f9b3b7f82d" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.368796 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552606-pxjj6" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.379522 4795 generic.go:334] "Generic (PLEG): container finished" podID="e2b540d3-b27e-4bf6-922c-89fef3c66955" containerID="c53b8d09e84e7e4cdb96ebd3e0ade02887c0bce0356a6a5e016382d13f4fdfe5" exitCode=0 Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.379577 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" event={"ID":"e2b540d3-b27e-4bf6-922c-89fef3c66955","Type":"ContainerDied","Data":"c53b8d09e84e7e4cdb96ebd3e0ade02887c0bce0356a6a5e016382d13f4fdfe5"} Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.656443 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.669562 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552600-mcxq8"] Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.677220 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552600-mcxq8"] Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.792649 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-config\") pod \"e2b540d3-b27e-4bf6-922c-89fef3c66955\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.792765 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb59l\" (UniqueName: \"kubernetes.io/projected/e2b540d3-b27e-4bf6-922c-89fef3c66955-kube-api-access-cb59l\") pod \"e2b540d3-b27e-4bf6-922c-89fef3c66955\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.792836 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-dns-svc\") pod \"e2b540d3-b27e-4bf6-922c-89fef3c66955\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.792892 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-nb\") pod \"e2b540d3-b27e-4bf6-922c-89fef3c66955\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.793009 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-sb\") pod \"e2b540d3-b27e-4bf6-922c-89fef3c66955\" (UID: \"e2b540d3-b27e-4bf6-922c-89fef3c66955\") " Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.797301 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2b540d3-b27e-4bf6-922c-89fef3c66955-kube-api-access-cb59l" (OuterVolumeSpecName: "kube-api-access-cb59l") pod "e2b540d3-b27e-4bf6-922c-89fef3c66955" (UID: "e2b540d3-b27e-4bf6-922c-89fef3c66955"). InnerVolumeSpecName "kube-api-access-cb59l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.838151 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e2b540d3-b27e-4bf6-922c-89fef3c66955" (UID: "e2b540d3-b27e-4bf6-922c-89fef3c66955"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.838637 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-config" (OuterVolumeSpecName: "config") pod "e2b540d3-b27e-4bf6-922c-89fef3c66955" (UID: "e2b540d3-b27e-4bf6-922c-89fef3c66955"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.847628 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e2b540d3-b27e-4bf6-922c-89fef3c66955" (UID: "e2b540d3-b27e-4bf6-922c-89fef3c66955"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.847824 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e2b540d3-b27e-4bf6-922c-89fef3c66955" (UID: "e2b540d3-b27e-4bf6-922c-89fef3c66955"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.894175 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.894204 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.894216 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.894224 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e2b540d3-b27e-4bf6-922c-89fef3c66955-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:08 crc kubenswrapper[4795]: I0310 15:26:08.894234 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb59l\" (UniqueName: \"kubernetes.io/projected/e2b540d3-b27e-4bf6-922c-89fef3c66955-kube-api-access-cb59l\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:09 crc kubenswrapper[4795]: I0310 15:26:09.392838 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" 
event={"ID":"e2b540d3-b27e-4bf6-922c-89fef3c66955","Type":"ContainerDied","Data":"47fb011077be883cc729d7e19cb532aecab6134e5708bda4cc43d8e41855fbdd"} Mar 10 15:26:09 crc kubenswrapper[4795]: I0310 15:26:09.393166 4795 scope.go:117] "RemoveContainer" containerID="c53b8d09e84e7e4cdb96ebd3e0ade02887c0bce0356a6a5e016382d13f4fdfe5" Mar 10 15:26:09 crc kubenswrapper[4795]: I0310 15:26:09.393194 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" Mar 10 15:26:09 crc kubenswrapper[4795]: I0310 15:26:09.396523 4795 generic.go:334] "Generic (PLEG): container finished" podID="948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5" containerID="94143bbcaa5dbca085364e324ff835b43fdff929594d46a37cee4aa3fa2dce9d" exitCode=0 Mar 10 15:26:09 crc kubenswrapper[4795]: I0310 15:26:09.396608 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ksntn" event={"ID":"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5","Type":"ContainerDied","Data":"94143bbcaa5dbca085364e324ff835b43fdff929594d46a37cee4aa3fa2dce9d"} Mar 10 15:26:09 crc kubenswrapper[4795]: I0310 15:26:09.435460 4795 scope.go:117] "RemoveContainer" containerID="fe1b8ecbb879fe289a13b411e19fdefc6599d9ca3d770da7d54458ae0396b35e" Mar 10 15:26:09 crc kubenswrapper[4795]: I0310 15:26:09.459326 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-p4wdl"] Mar 10 15:26:09 crc kubenswrapper[4795]: I0310 15:26:09.470341 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-p4wdl"] Mar 10 15:26:09 crc kubenswrapper[4795]: I0310 15:26:09.489845 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73528fe5-4e4b-4ba5-b30e-e781e1f8d12e" path="/var/lib/kubelet/pods/73528fe5-4e4b-4ba5-b30e-e781e1f8d12e/volumes" Mar 10 15:26:09 crc kubenswrapper[4795]: I0310 15:26:09.491215 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e2b540d3-b27e-4bf6-922c-89fef3c66955" path="/var/lib/kubelet/pods/e2b540d3-b27e-4bf6-922c-89fef3c66955/volumes" Mar 10 15:26:10 crc kubenswrapper[4795]: I0310 15:26:10.739972 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ksntn" Mar 10 15:26:10 crc kubenswrapper[4795]: I0310 15:26:10.828418 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-config-data\") pod \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " Mar 10 15:26:10 crc kubenswrapper[4795]: I0310 15:26:10.828553 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-558vz\" (UniqueName: \"kubernetes.io/projected/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-kube-api-access-558vz\") pod \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " Mar 10 15:26:10 crc kubenswrapper[4795]: I0310 15:26:10.828628 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-combined-ca-bundle\") pod \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\" (UID: \"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5\") " Mar 10 15:26:10 crc kubenswrapper[4795]: I0310 15:26:10.834003 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-kube-api-access-558vz" (OuterVolumeSpecName: "kube-api-access-558vz") pod "948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5" (UID: "948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5"). InnerVolumeSpecName "kube-api-access-558vz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:10 crc kubenswrapper[4795]: I0310 15:26:10.854661 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5" (UID: "948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:10 crc kubenswrapper[4795]: I0310 15:26:10.874125 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-config-data" (OuterVolumeSpecName: "config-data") pod "948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5" (UID: "948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:10 crc kubenswrapper[4795]: I0310 15:26:10.930651 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:10 crc kubenswrapper[4795]: I0310 15:26:10.930714 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-558vz\" (UniqueName: \"kubernetes.io/projected/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-kube-api-access-558vz\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:10 crc kubenswrapper[4795]: I0310 15:26:10.930746 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.424765 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-ksntn" 
event={"ID":"948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5","Type":"ContainerDied","Data":"109d05df610ce9bd83bd061f81ee2d70ad424a11b9960fbccb72700e03be5e9f"} Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.425058 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="109d05df610ce9bd83bd061f81ee2d70ad424a11b9960fbccb72700e03be5e9f" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.425167 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-ksntn" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.708266 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-sqv7g"] Mar 10 15:26:11 crc kubenswrapper[4795]: E0310 15:26:11.708744 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5" containerName="keystone-db-sync" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.708763 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5" containerName="keystone-db-sync" Mar 10 15:26:11 crc kubenswrapper[4795]: E0310 15:26:11.708773 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b540d3-b27e-4bf6-922c-89fef3c66955" containerName="init" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.708783 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b540d3-b27e-4bf6-922c-89fef3c66955" containerName="init" Mar 10 15:26:11 crc kubenswrapper[4795]: E0310 15:26:11.708805 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c2d650a-a19d-4c82-a3fe-34850d8dbbc5" containerName="mariadb-database-create" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.708813 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c2d650a-a19d-4c82-a3fe-34850d8dbbc5" containerName="mariadb-database-create" Mar 10 15:26:11 crc kubenswrapper[4795]: E0310 15:26:11.708825 4795 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c217cb85-4bff-4cc2-a554-8dd436e093b0" containerName="mariadb-database-create" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.708835 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c217cb85-4bff-4cc2-a554-8dd436e093b0" containerName="mariadb-database-create" Mar 10 15:26:11 crc kubenswrapper[4795]: E0310 15:26:11.708848 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2d3011-1ce1-43c4-b058-e2171446b079" containerName="mariadb-account-create-update" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.708855 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2d3011-1ce1-43c4-b058-e2171446b079" containerName="mariadb-account-create-update" Mar 10 15:26:11 crc kubenswrapper[4795]: E0310 15:26:11.708879 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f110d7f5-06c8-4211-b21e-7a52e6b694b7" containerName="oc" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.708888 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f110d7f5-06c8-4211-b21e-7a52e6b694b7" containerName="oc" Mar 10 15:26:11 crc kubenswrapper[4795]: E0310 15:26:11.708899 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2b540d3-b27e-4bf6-922c-89fef3c66955" containerName="dnsmasq-dns" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.708911 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2b540d3-b27e-4bf6-922c-89fef3c66955" containerName="dnsmasq-dns" Mar 10 15:26:11 crc kubenswrapper[4795]: E0310 15:26:11.708939 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="859fb054-ddc6-430b-b049-0571c3c57be3" containerName="mariadb-account-create-update" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.708947 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="859fb054-ddc6-430b-b049-0571c3c57be3" containerName="mariadb-account-create-update" Mar 10 15:26:11 crc kubenswrapper[4795]: E0310 15:26:11.708964 
4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="325f8d75-2010-44ed-a01a-954670df7e15" containerName="mariadb-database-create" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.708973 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="325f8d75-2010-44ed-a01a-954670df7e15" containerName="mariadb-database-create" Mar 10 15:26:11 crc kubenswrapper[4795]: E0310 15:26:11.708996 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="450e20be-0c76-4062-8e10-4a11808a9cca" containerName="mariadb-account-create-update" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.709004 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="450e20be-0c76-4062-8e10-4a11808a9cca" containerName="mariadb-account-create-update" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.709205 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c217cb85-4bff-4cc2-a554-8dd436e093b0" containerName="mariadb-database-create" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.709224 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2d3011-1ce1-43c4-b058-e2171446b079" containerName="mariadb-account-create-update" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.709239 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="450e20be-0c76-4062-8e10-4a11808a9cca" containerName="mariadb-account-create-update" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.709250 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2b540d3-b27e-4bf6-922c-89fef3c66955" containerName="dnsmasq-dns" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.709266 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="859fb054-ddc6-430b-b049-0571c3c57be3" containerName="mariadb-account-create-update" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.709278 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f110d7f5-06c8-4211-b21e-7a52e6b694b7" containerName="oc" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.709294 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5" containerName="keystone-db-sync" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.709305 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="325f8d75-2010-44ed-a01a-954670df7e15" containerName="mariadb-database-create" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.709320 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c2d650a-a19d-4c82-a3fe-34850d8dbbc5" containerName="mariadb-database-create" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.710422 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.747162 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-sqv7g"] Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.759523 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6dkln"] Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.760797 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.765602 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.765970 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.766176 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.770067 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.770067 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mn9kr" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.796530 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6dkln"] Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851107 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851167 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jq2r\" (UniqueName: \"kubernetes.io/projected/e6867b14-b00d-4722-8e43-ae8821b9f27d-kube-api-access-9jq2r\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851196 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-fernet-keys\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851240 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851262 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhhk2\" (UniqueName: \"kubernetes.io/projected/72f8600b-d744-4318-a08b-4f0a6da6bb9b-kube-api-access-lhhk2\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851288 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851339 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-credential-keys\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851363 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-combined-ca-bundle\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851405 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851425 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-config\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851445 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-scripts\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.851508 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-config-data\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.889629 4795 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/horizon-6846f8f877-gl4gz"] Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.898392 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.903422 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-vwnh8" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.903583 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.904049 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.908537 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.946199 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6846f8f877-gl4gz"] Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.952828 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.952864 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhhk2\" (UniqueName: \"kubernetes.io/projected/72f8600b-d744-4318-a08b-4f0a6da6bb9b-kube-api-access-lhhk2\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.952894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.952928 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-logs\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.952960 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-credential-keys\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.952978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-combined-ca-bundle\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.952995 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.953010 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-config\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.953023 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-scripts\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.953042 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpsmq\" (UniqueName: \"kubernetes.io/projected/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-kube-api-access-dpsmq\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.953090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-config-data\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.953112 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-scripts\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.953169 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-nb\") pod 
\"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.953189 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jq2r\" (UniqueName: \"kubernetes.io/projected/e6867b14-b00d-4722-8e43-ae8821b9f27d-kube-api-access-9jq2r\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.953203 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-fernet-keys\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.953220 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-config-data\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.953240 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-horizon-secret-key\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.953999 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: 
\"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.983112 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mg79m"] Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.985707 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.992940 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:11 crc kubenswrapper[4795]: I0310 15:26:11.994965 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-credential-keys\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.008155 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-config-data\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.008813 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-nb\") pod 
\"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.016234 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-combined-ca-bundle\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.018668 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.019845 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-scripts\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.024543 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-fernet-keys\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.036311 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhhk2\" (UniqueName: \"kubernetes.io/projected/72f8600b-d744-4318-a08b-4f0a6da6bb9b-kube-api-access-lhhk2\") pod \"keystone-bootstrap-6dkln\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.040368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-config\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.052699 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.056250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-config-data\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.056300 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-horizon-secret-key\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.056385 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-scripts\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.056409 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-db-sync-config-data\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.056492 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-logs\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.056552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-combined-ca-bundle\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.056610 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpsmq\" (UniqueName: \"kubernetes.io/projected/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-kube-api-access-dpsmq\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.056672 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-config-data\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.056794 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-scripts\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.056902 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw449\" 
(UniqueName: \"kubernetes.io/projected/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-kube-api-access-tw449\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.057018 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-etc-machine-id\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.057717 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-scripts\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.058561 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-logs\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.061246 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nck2z" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.065177 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-config-data\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.069770 4795 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"cinder-config-data" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.114962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jq2r\" (UniqueName: \"kubernetes.io/projected/e6867b14-b00d-4722-8e43-ae8821b9f27d-kube-api-access-9jq2r\") pod \"dnsmasq-dns-bbf5cc879-sqv7g\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.115316 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-horizon-secret-key\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.122757 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.123551 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpsmq\" (UniqueName: \"kubernetes.io/projected/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-kube-api-access-dpsmq\") pod \"horizon-6846f8f877-gl4gz\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.152034 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-tk68s"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.153229 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.155389 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.155605 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.155849 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v9g98" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.164861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-scripts\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.164902 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-db-sync-config-data\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.164941 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-combined-ca-bundle\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.164987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-config-data\") pod \"cinder-db-sync-mg79m\" (UID: 
\"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.165033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw449\" (UniqueName: \"kubernetes.io/projected/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-kube-api-access-tw449\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.165049 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-etc-machine-id\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.165147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-etc-machine-id\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.171147 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-db-sync-config-data\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.171212 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mg79m"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.173000 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-config-data\") pod 
\"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.178102 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-combined-ca-bundle\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.179485 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-scripts\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.183138 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-tk68s"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.193376 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.193840 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.196028 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.200422 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.200670 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.208626 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.209523 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw449\" (UniqueName: \"kubernetes.io/projected/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-kube-api-access-tw449\") pod \"cinder-db-sync-mg79m\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.215159 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mg79m" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.227532 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-t2rg9"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.233497 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.235484 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.237084 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fmcp8" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.237311 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.245133 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7dcccd79bc-8z54g"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.246510 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.257621 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-sqv7g"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.258368 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.266520 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-combined-ca-bundle\") pod \"neutron-db-sync-tk68s\" (UID: \"07ac8661-08c8-49aa-ae40-cf472895a954\") " pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.266664 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-config\") pod \"neutron-db-sync-tk68s\" (UID: \"07ac8661-08c8-49aa-ae40-cf472895a954\") " pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.266760 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93d5e00-4866-406a-ad39-c3ab0b2156b0-logs\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.267029 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.267133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdqbv\" (UniqueName: \"kubernetes.io/projected/79684c90-dc4f-4187-a086-c0777de981e3-kube-api-access-rdqbv\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 
15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.267195 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.267572 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-combined-ca-bundle\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.267861 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zqtk\" (UniqueName: \"kubernetes.io/projected/07ac8661-08c8-49aa-ae40-cf472895a954-kube-api-access-4zqtk\") pod \"neutron-db-sync-tk68s\" (UID: \"07ac8661-08c8-49aa-ae40-cf472895a954\") " pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.270266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxnnz\" (UniqueName: \"kubernetes.io/projected/f93d5e00-4866-406a-ad39-c3ab0b2156b0-kube-api-access-hxnnz\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.270389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-log-httpd\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc 
kubenswrapper[4795]: I0310 15:26:12.270521 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-scripts\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.270607 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-config-data\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.270731 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-config-data\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.270828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-scripts\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.270970 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-run-httpd\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.273438 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-db-sync-t2rg9"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.280824 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-9ktzf"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.282173 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.289408 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-7rp75" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.290140 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.291581 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dcccd79bc-8z54g"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.311202 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9ktzf"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.352344 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lrlkz"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.353700 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.359006 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lrlkz"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.373948 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f67b33e-b7f7-4f63-9e76-886e5e916da2-logs\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.373987 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-run-httpd\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374015 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-config-data\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-combined-ca-bundle\") pod \"neutron-db-sync-tk68s\" (UID: \"07ac8661-08c8-49aa-ae40-cf472895a954\") " pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374055 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-config\") pod 
\"neutron-db-sync-tk68s\" (UID: \"07ac8661-08c8-49aa-ae40-cf472895a954\") " pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374090 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93d5e00-4866-406a-ad39-c3ab0b2156b0-logs\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374107 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374129 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-db-sync-config-data\") pod \"barbican-db-sync-9ktzf\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") " pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374145 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdqbv\" (UniqueName: \"kubernetes.io/projected/79684c90-dc4f-4187-a086-c0777de981e3-kube-api-access-rdqbv\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374162 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc 
kubenswrapper[4795]: I0310 15:26:12.374187 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-combined-ca-bundle\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zqtk\" (UniqueName: \"kubernetes.io/projected/07ac8661-08c8-49aa-ae40-cf472895a954-kube-api-access-4zqtk\") pod \"neutron-db-sync-tk68s\" (UID: \"07ac8661-08c8-49aa-ae40-cf472895a954\") " pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374229 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxnnz\" (UniqueName: \"kubernetes.io/projected/f93d5e00-4866-406a-ad39-c3ab0b2156b0-kube-api-access-hxnnz\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-scripts\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374262 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-log-httpd\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374304 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj5ls\" (UniqueName: \"kubernetes.io/projected/e3e5de16-defe-4daa-94cc-3d50e3461dbd-kube-api-access-mj5ls\") pod \"barbican-db-sync-9ktzf\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") " pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-scripts\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374343 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-config-data\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-config-data\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374396 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-scripts\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374411 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-combined-ca-bundle\") pod \"barbican-db-sync-9ktzf\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") " pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374438 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9f67b33e-b7f7-4f63-9e76-886e5e916da2-horizon-secret-key\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374465 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc24m\" (UniqueName: \"kubernetes.io/projected/9f67b33e-b7f7-4f63-9e76-886e5e916da2-kube-api-access-gc24m\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.374898 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-run-httpd\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.376278 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-log-httpd\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.381438 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93d5e00-4866-406a-ad39-c3ab0b2156b0-logs\") pod \"placement-db-sync-t2rg9\" (UID: 
\"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.386306 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-config\") pod \"neutron-db-sync-tk68s\" (UID: \"07ac8661-08c8-49aa-ae40-cf472895a954\") " pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.398959 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-scripts\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.401124 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-config-data\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.402642 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-combined-ca-bundle\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.403892 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.406608 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-scripts\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.407610 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.409395 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-combined-ca-bundle\") pod \"neutron-db-sync-tk68s\" (UID: \"07ac8661-08c8-49aa-ae40-cf472895a954\") " pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.411894 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-config-data\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.414056 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxnnz\" (UniqueName: \"kubernetes.io/projected/f93d5e00-4866-406a-ad39-c3ab0b2156b0-kube-api-access-hxnnz\") pod \"placement-db-sync-t2rg9\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") " pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.416776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zqtk\" (UniqueName: \"kubernetes.io/projected/07ac8661-08c8-49aa-ae40-cf472895a954-kube-api-access-4zqtk\") pod \"neutron-db-sync-tk68s\" (UID: 
\"07ac8661-08c8-49aa-ae40-cf472895a954\") " pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.416845 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.418761 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.422028 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdqbv\" (UniqueName: \"kubernetes.io/projected/79684c90-dc4f-4187-a086-c0777de981e3-kube-api-access-rdqbv\") pod \"ceilometer-0\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.424854 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.425122 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.425285 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bxn5s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.426001 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.445052 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") 
" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476441 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9f67b33e-b7f7-4f63-9e76-886e5e916da2-horizon-secret-key\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476469 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476489 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-logs\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476511 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc24m\" (UniqueName: \"kubernetes.io/projected/9f67b33e-b7f7-4f63-9e76-886e5e916da2-kube-api-access-gc24m\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " 
pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476562 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f67b33e-b7f7-4f63-9e76-886e5e916da2-logs\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476582 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476599 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476614 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-config-data\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 
15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476663 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476680 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-db-sync-config-data\") pod \"barbican-db-sync-9ktzf\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") " pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-scripts\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476732 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvcvp\" (UniqueName: \"kubernetes.io/projected/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-kube-api-access-vvcvp\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476756 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc 
kubenswrapper[4795]: I0310 15:26:12.476784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476799 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476814 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj5ls\" (UniqueName: \"kubernetes.io/projected/e3e5de16-defe-4daa-94cc-3d50e3461dbd-kube-api-access-mj5ls\") pod \"barbican-db-sync-9ktzf\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") " pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476833 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-config\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476870 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhkh8\" (UniqueName: \"kubernetes.io/projected/bb23750d-b817-443b-8876-e16a7775629f-kube-api-access-jhkh8\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " 
pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.476906 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-combined-ca-bundle\") pod \"barbican-db-sync-9ktzf\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") " pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.478254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f67b33e-b7f7-4f63-9e76-886e5e916da2-logs\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.479887 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-scripts\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.479983 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-config-data\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.482133 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9f67b33e-b7f7-4f63-9e76-886e5e916da2-horizon-secret-key\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.482408 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-db-sync-config-data\") pod \"barbican-db-sync-9ktzf\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") " pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.483885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-combined-ca-bundle\") pod \"barbican-db-sync-9ktzf\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") " pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.493459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj5ls\" (UniqueName: \"kubernetes.io/projected/e3e5de16-defe-4daa-94cc-3d50e3461dbd-kube-api-access-mj5ls\") pod \"barbican-db-sync-9ktzf\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") " pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.493842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc24m\" (UniqueName: \"kubernetes.io/projected/9f67b33e-b7f7-4f63-9e76-886e5e916da2-kube-api-access-gc24m\") pod \"horizon-7dcccd79bc-8z54g\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.510110 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.513909 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.516231 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.518523 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.530003 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.540761 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.561792 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.577985 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvcvp\" (UniqueName: \"kubernetes.io/projected/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-kube-api-access-vvcvp\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.578036 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.578070 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" 
(UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.578130 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.578159 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.578228 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.578250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-config\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.578429 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") device mount path 
\"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579012 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579057 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhkh8\" (UniqueName: \"kubernetes.io/projected/bb23750d-b817-443b-8876-e16a7775629f-kube-api-access-jhkh8\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579107 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579152 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579171 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-logs\") pod \"glance-default-external-api-0\" (UID: 
\"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-config\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579508 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579616 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579644 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-logs\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc 
kubenswrapper[4795]: I0310 15:26:12.579749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579766 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579781 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579817 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579850 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjpdb\" (UniqueName: \"kubernetes.io/projected/52cb80dd-dcc6-477f-83d4-810da076df4f-kube-api-access-jjpdb\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579874 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.579891 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.580499 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.580729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.580746 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-logs\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.581404 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.583521 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.583615 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.584729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.585421 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.588642 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-t2rg9" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.596265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvcvp\" (UniqueName: \"kubernetes.io/projected/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-kube-api-access-vvcvp\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.600018 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhkh8\" (UniqueName: \"kubernetes.io/projected/bb23750d-b817-443b-8876-e16a7775629f-kube-api-access-jhkh8\") pod \"dnsmasq-dns-56df8fb6b7-lrlkz\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.622502 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.623310 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.639776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.684478 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.684527 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.684574 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.684611 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: 
I0310 15:26:12.684627 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-logs\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.684660 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.684672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.684705 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjpdb\" (UniqueName: \"kubernetes.io/projected/52cb80dd-dcc6-477f-83d4-810da076df4f-kube-api-access-jjpdb\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.685011 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.685882 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.686299 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.689458 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-logs\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.694672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.696109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.696832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 
15:26:12.696844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.701115 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjpdb\" (UniqueName: \"kubernetes.io/projected/52cb80dd-dcc6-477f-83d4-810da076df4f-kube-api-access-jjpdb\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.712952 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.744371 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6dkln"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.744942 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: W0310 15:26:12.769852 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72f8600b_d744_4318_a08b_4f0a6da6bb9b.slice/crio-6b03268b43a520a77b884c74248a9e3fb66b5b97b4909e27e0c6e54eacc3d0ea WatchSource:0}: Error finding container 6b03268b43a520a77b884c74248a9e3fb66b5b97b4909e27e0c6e54eacc3d0ea: Status 404 returned error can't find the container with id 6b03268b43a520a77b884c74248a9e3fb66b5b97b4909e27e0c6e54eacc3d0ea Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.846995 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.854945 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6846f8f877-gl4gz"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.925335 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-sqv7g"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.943395 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mg79m"] Mar 10 15:26:12 crc kubenswrapper[4795]: I0310 15:26:12.954289 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.422130 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.428514 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-p4wdl" podUID="e2b540d3-b27e-4bf6-922c-89fef3c66955" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.114:5353: i/o timeout" Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.461708 4795 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/neutron-db-sync-tk68s"] Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.507816 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79684c90-dc4f-4187-a086-c0777de981e3","Type":"ContainerStarted","Data":"2933eaf16e635f6b4a61564a93fb463e1ed195ae230c20f21c338a26187d3297"} Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.507866 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6dkln" event={"ID":"72f8600b-d744-4318-a08b-4f0a6da6bb9b","Type":"ContainerStarted","Data":"496a0ce2c98b076ed87eb51f2cf780382ff8c97977b8e27b77d3a409e489d75f"} Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.507899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6dkln" event={"ID":"72f8600b-d744-4318-a08b-4f0a6da6bb9b","Type":"ContainerStarted","Data":"6b03268b43a520a77b884c74248a9e3fb66b5b97b4909e27e0c6e54eacc3d0ea"} Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.507939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mg79m" event={"ID":"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2","Type":"ContainerStarted","Data":"290c84b4eb2f3f9f70945196eaf23a71165539683f6eaf0a6ed625dd65526e8e"} Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.507954 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" event={"ID":"e6867b14-b00d-4722-8e43-ae8821b9f27d","Type":"ContainerStarted","Data":"9af78df553af02aa5bd73cbbcac9ec12cda4e25854c73769def206ec7114477e"} Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.522350 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tk68s" event={"ID":"07ac8661-08c8-49aa-ae40-cf472895a954","Type":"ContainerStarted","Data":"d087ae3107aef3faba20d9dea30e682158956e3fbb1394ab84673ef416a53646"} Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.526495 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/horizon-6846f8f877-gl4gz" event={"ID":"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a","Type":"ContainerStarted","Data":"5ac6658198ea8b1e64aad25eef1c9c2925fb8f78ee973de5c716a691f72ad44a"} Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.528815 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6dkln" podStartSLOduration=2.528793484 podStartE2EDuration="2.528793484s" podCreationTimestamp="2026-03-10 15:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:13.5223523 +0000 UTC m=+1206.688093198" watchObservedRunningTime="2026-03-10 15:26:13.528793484 +0000 UTC m=+1206.694534382" Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.566747 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-9ktzf"] Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.573267 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lrlkz"] Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.582433 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7dcccd79bc-8z54g"] Mar 10 15:26:13 crc kubenswrapper[4795]: W0310 15:26:13.589712 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3e5de16_defe_4daa_94cc_3d50e3461dbd.slice/crio-f48d5ee36ddd562f8c5d22cb6afc774802a00bade9f649d501641eb9453bd530 WatchSource:0}: Error finding container f48d5ee36ddd562f8c5d22cb6afc774802a00bade9f649d501641eb9453bd530: Status 404 returned error can't find the container with id f48d5ee36ddd562f8c5d22cb6afc774802a00bade9f649d501641eb9453bd530 Mar 10 15:26:13 crc kubenswrapper[4795]: W0310 15:26:13.594954 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f67b33e_b7f7_4f63_9e76_886e5e916da2.slice/crio-458c6ba0162cc1fef29d218f364fd15a3344f36655dc1262f1eb657f608e5c45 WatchSource:0}: Error finding container 458c6ba0162cc1fef29d218f364fd15a3344f36655dc1262f1eb657f608e5c45: Status 404 returned error can't find the container with id 458c6ba0162cc1fef29d218f364fd15a3344f36655dc1262f1eb657f608e5c45 Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.754102 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-t2rg9"] Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.815257 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:13 crc kubenswrapper[4795]: W0310 15:26:13.839761 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1690b419_fdd1_4a50_9a57_1ef4c2a9c49e.slice/crio-e547a89a1e477fc2106107051f9346b74deb8208ebe3c3e851cff6df4ec2f9af WatchSource:0}: Error finding container e547a89a1e477fc2106107051f9346b74deb8208ebe3c3e851cff6df4ec2f9af: Status 404 returned error can't find the container with id e547a89a1e477fc2106107051f9346b74deb8208ebe3c3e851cff6df4ec2f9af Mar 10 15:26:13 crc kubenswrapper[4795]: I0310 15:26:13.920855 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.273165 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.317740 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6846f8f877-gl4gz"] Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.349154 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cc9fc6c77-knd89"] Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.350845 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.381968 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cc9fc6c77-knd89"] Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.424040 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-scripts\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.424127 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79398470-9b8c-45b8-99c2-90b6a0927da4-horizon-secret-key\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.424154 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvwmm\" (UniqueName: \"kubernetes.io/projected/79398470-9b8c-45b8-99c2-90b6a0927da4-kube-api-access-wvwmm\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.424253 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-config-data\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.424569 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79398470-9b8c-45b8-99c2-90b6a0927da4-logs\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.437949 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.454473 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.526029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-scripts\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.526110 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79398470-9b8c-45b8-99c2-90b6a0927da4-horizon-secret-key\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.526140 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvwmm\" (UniqueName: \"kubernetes.io/projected/79398470-9b8c-45b8-99c2-90b6a0927da4-kube-api-access-wvwmm\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.526221 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-config-data\") 
pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.526253 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79398470-9b8c-45b8-99c2-90b6a0927da4-logs\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.526697 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79398470-9b8c-45b8-99c2-90b6a0927da4-logs\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.527978 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-scripts\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.530540 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-config-data\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.549739 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79398470-9b8c-45b8-99c2-90b6a0927da4-horizon-secret-key\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 
15:26:14.563168 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvwmm\" (UniqueName: \"kubernetes.io/projected/79398470-9b8c-45b8-99c2-90b6a0927da4-kube-api-access-wvwmm\") pod \"horizon-5cc9fc6c77-knd89\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.607843 4795 generic.go:334] "Generic (PLEG): container finished" podID="bb23750d-b817-443b-8876-e16a7775629f" containerID="a83be8cdb91213084df280f2f75f7e9b9d5798bba075eaea8c9d64e0e4b9d89e" exitCode=0 Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.607900 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" event={"ID":"bb23750d-b817-443b-8876-e16a7775629f","Type":"ContainerDied","Data":"a83be8cdb91213084df280f2f75f7e9b9d5798bba075eaea8c9d64e0e4b9d89e"} Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.607926 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" event={"ID":"bb23750d-b817-443b-8876-e16a7775629f","Type":"ContainerStarted","Data":"76feefe9519e1da5eecbe1f5d3ee153cf077a73763adeb144ac296990942a1dd"} Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.621869 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e","Type":"ContainerStarted","Data":"e547a89a1e477fc2106107051f9346b74deb8208ebe3c3e851cff6df4ec2f9af"} Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.639848 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9ktzf" event={"ID":"e3e5de16-defe-4daa-94cc-3d50e3461dbd","Type":"ContainerStarted","Data":"f48d5ee36ddd562f8c5d22cb6afc774802a00bade9f649d501641eb9453bd530"} Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.647565 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="e6867b14-b00d-4722-8e43-ae8821b9f27d" containerID="8901a22d148459e8cab8587ce747efeccb5344be9c2119af76fb8cf6295e1235" exitCode=0 Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.647623 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" event={"ID":"e6867b14-b00d-4722-8e43-ae8821b9f27d","Type":"ContainerDied","Data":"8901a22d148459e8cab8587ce747efeccb5344be9c2119af76fb8cf6295e1235"} Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.652407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tk68s" event={"ID":"07ac8661-08c8-49aa-ae40-cf472895a954","Type":"ContainerStarted","Data":"73e3d665d4e95bccb029e0366456de02da3d428fe7eb1c17ddf7df05d991dd13"} Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.676942 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52cb80dd-dcc6-477f-83d4-810da076df4f","Type":"ContainerStarted","Data":"8e3a58ec33b894c6cf37a4b927105a9266d1c93125854c1a114ab94f09ebad5e"} Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.707593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-t2rg9" event={"ID":"f93d5e00-4866-406a-ad39-c3ab0b2156b0","Type":"ContainerStarted","Data":"27fd645679541b9bd189a858f2d4582821dc36c73f0ff78d7cc7f7164f5c2b48"} Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.709604 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dcccd79bc-8z54g" event={"ID":"9f67b33e-b7f7-4f63-9e76-886e5e916da2","Type":"ContainerStarted","Data":"458c6ba0162cc1fef29d218f364fd15a3344f36655dc1262f1eb657f608e5c45"} Mar 10 15:26:14 crc kubenswrapper[4795]: I0310 15:26:14.717906 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.070041 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.089145 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-tk68s" podStartSLOduration=4.089130518 podStartE2EDuration="4.089130518s" podCreationTimestamp="2026-03-10 15:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:14.710339957 +0000 UTC m=+1207.876080855" watchObservedRunningTime="2026-03-10 15:26:15.089130518 +0000 UTC m=+1208.254871416" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.170219 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-sb\") pod \"e6867b14-b00d-4722-8e43-ae8821b9f27d\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.170330 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-svc\") pod \"e6867b14-b00d-4722-8e43-ae8821b9f27d\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.170411 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jq2r\" (UniqueName: \"kubernetes.io/projected/e6867b14-b00d-4722-8e43-ae8821b9f27d-kube-api-access-9jq2r\") pod \"e6867b14-b00d-4722-8e43-ae8821b9f27d\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.170495 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-config\") pod \"e6867b14-b00d-4722-8e43-ae8821b9f27d\" (UID: 
\"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.170577 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-nb\") pod \"e6867b14-b00d-4722-8e43-ae8821b9f27d\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.170638 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-swift-storage-0\") pod \"e6867b14-b00d-4722-8e43-ae8821b9f27d\" (UID: \"e6867b14-b00d-4722-8e43-ae8821b9f27d\") " Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.175687 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6867b14-b00d-4722-8e43-ae8821b9f27d-kube-api-access-9jq2r" (OuterVolumeSpecName: "kube-api-access-9jq2r") pod "e6867b14-b00d-4722-8e43-ae8821b9f27d" (UID: "e6867b14-b00d-4722-8e43-ae8821b9f27d"). InnerVolumeSpecName "kube-api-access-9jq2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.200890 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e6867b14-b00d-4722-8e43-ae8821b9f27d" (UID: "e6867b14-b00d-4722-8e43-ae8821b9f27d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.209638 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e6867b14-b00d-4722-8e43-ae8821b9f27d" (UID: "e6867b14-b00d-4722-8e43-ae8821b9f27d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.214174 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-config" (OuterVolumeSpecName: "config") pod "e6867b14-b00d-4722-8e43-ae8821b9f27d" (UID: "e6867b14-b00d-4722-8e43-ae8821b9f27d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.214331 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e6867b14-b00d-4722-8e43-ae8821b9f27d" (UID: "e6867b14-b00d-4722-8e43-ae8821b9f27d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.231945 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e6867b14-b00d-4722-8e43-ae8821b9f27d" (UID: "e6867b14-b00d-4722-8e43-ae8821b9f27d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.272178 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.272473 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.272486 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.272498 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.272509 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jq2r\" (UniqueName: \"kubernetes.io/projected/e6867b14-b00d-4722-8e43-ae8821b9f27d-kube-api-access-9jq2r\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.272519 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6867b14-b00d-4722-8e43-ae8821b9f27d-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.297022 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cc9fc6c77-knd89"] Mar 10 15:26:15 crc kubenswrapper[4795]: W0310 15:26:15.306334 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79398470_9b8c_45b8_99c2_90b6a0927da4.slice/crio-736e294b9df55d7270d3c2a594a45779f54fcb79c8f8a131476b3eeb81b615db WatchSource:0}: Error finding container 736e294b9df55d7270d3c2a594a45779f54fcb79c8f8a131476b3eeb81b615db: Status 404 returned error can't find the container with id 736e294b9df55d7270d3c2a594a45779f54fcb79c8f8a131476b3eeb81b615db Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.734188 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cc9fc6c77-knd89" event={"ID":"79398470-9b8c-45b8-99c2-90b6a0927da4","Type":"ContainerStarted","Data":"736e294b9df55d7270d3c2a594a45779f54fcb79c8f8a131476b3eeb81b615db"} Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.743963 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" event={"ID":"e6867b14-b00d-4722-8e43-ae8821b9f27d","Type":"ContainerDied","Data":"9af78df553af02aa5bd73cbbcac9ec12cda4e25854c73769def206ec7114477e"} Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.744015 4795 scope.go:117] "RemoveContainer" containerID="8901a22d148459e8cab8587ce747efeccb5344be9c2119af76fb8cf6295e1235" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.744175 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-sqv7g" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.750569 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52cb80dd-dcc6-477f-83d4-810da076df4f","Type":"ContainerStarted","Data":"c405256780b813c61c4f47ea069dfd731cc2f6405f5169844bf7a4237545d35d"} Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.757803 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" event={"ID":"bb23750d-b817-443b-8876-e16a7775629f","Type":"ContainerStarted","Data":"48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a"} Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.758541 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.761379 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e","Type":"ContainerStarted","Data":"e28021df84b69443e74e2db18d55c6154ebe19ca88de4665c973f2fcbe4449ed"} Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.795931 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-sqv7g"] Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.814642 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-sqv7g"] Mar 10 15:26:15 crc kubenswrapper[4795]: I0310 15:26:15.814966 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" podStartSLOduration=3.814952001 podStartE2EDuration="3.814952001s" podCreationTimestamp="2026-03-10 15:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:15.798663935 +0000 UTC 
m=+1208.964404873" watchObservedRunningTime="2026-03-10 15:26:15.814952001 +0000 UTC m=+1208.980692889" Mar 10 15:26:16 crc kubenswrapper[4795]: I0310 15:26:16.801669 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e","Type":"ContainerStarted","Data":"16a729f72fc0271e64c35902563148df00d1d2da53c0acd6891dc03dbb6cf59e"} Mar 10 15:26:16 crc kubenswrapper[4795]: I0310 15:26:16.801773 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" containerName="glance-log" containerID="cri-o://e28021df84b69443e74e2db18d55c6154ebe19ca88de4665c973f2fcbe4449ed" gracePeriod=30 Mar 10 15:26:16 crc kubenswrapper[4795]: I0310 15:26:16.802118 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" containerName="glance-httpd" containerID="cri-o://16a729f72fc0271e64c35902563148df00d1d2da53c0acd6891dc03dbb6cf59e" gracePeriod=30 Mar 10 15:26:16 crc kubenswrapper[4795]: I0310 15:26:16.821620 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.821603124 podStartE2EDuration="4.821603124s" podCreationTimestamp="2026-03-10 15:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:16.820813162 +0000 UTC m=+1209.986554060" watchObservedRunningTime="2026-03-10 15:26:16.821603124 +0000 UTC m=+1209.987344022" Mar 10 15:26:16 crc kubenswrapper[4795]: I0310 15:26:16.824338 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="52cb80dd-dcc6-477f-83d4-810da076df4f" containerName="glance-log" 
containerID="cri-o://c405256780b813c61c4f47ea069dfd731cc2f6405f5169844bf7a4237545d35d" gracePeriod=30 Mar 10 15:26:16 crc kubenswrapper[4795]: I0310 15:26:16.824679 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52cb80dd-dcc6-477f-83d4-810da076df4f","Type":"ContainerStarted","Data":"8c3d3c652d6e009f8d2508749dced2dbc045e64d8392b11ae60b3d432ac50175"} Mar 10 15:26:16 crc kubenswrapper[4795]: I0310 15:26:16.824784 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="52cb80dd-dcc6-477f-83d4-810da076df4f" containerName="glance-httpd" containerID="cri-o://8c3d3c652d6e009f8d2508749dced2dbc045e64d8392b11ae60b3d432ac50175" gracePeriod=30 Mar 10 15:26:16 crc kubenswrapper[4795]: I0310 15:26:16.843358 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.843340386 podStartE2EDuration="4.843340386s" podCreationTimestamp="2026-03-10 15:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:16.840567206 +0000 UTC m=+1210.006308124" watchObservedRunningTime="2026-03-10 15:26:16.843340386 +0000 UTC m=+1210.009081284" Mar 10 15:26:17 crc kubenswrapper[4795]: E0310 15:26:17.150555 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52cb80dd_dcc6_477f_83d4_810da076df4f.slice/crio-conmon-8c3d3c652d6e009f8d2508749dced2dbc045e64d8392b11ae60b3d432ac50175.scope\": RecentStats: unable to find data in memory cache]" Mar 10 15:26:17 crc kubenswrapper[4795]: I0310 15:26:17.491750 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6867b14-b00d-4722-8e43-ae8821b9f27d" 
path="/var/lib/kubelet/pods/e6867b14-b00d-4722-8e43-ae8821b9f27d/volumes" Mar 10 15:26:17 crc kubenswrapper[4795]: I0310 15:26:17.838405 4795 generic.go:334] "Generic (PLEG): container finished" podID="52cb80dd-dcc6-477f-83d4-810da076df4f" containerID="8c3d3c652d6e009f8d2508749dced2dbc045e64d8392b11ae60b3d432ac50175" exitCode=0 Mar 10 15:26:17 crc kubenswrapper[4795]: I0310 15:26:17.838445 4795 generic.go:334] "Generic (PLEG): container finished" podID="52cb80dd-dcc6-477f-83d4-810da076df4f" containerID="c405256780b813c61c4f47ea069dfd731cc2f6405f5169844bf7a4237545d35d" exitCode=143 Mar 10 15:26:17 crc kubenswrapper[4795]: I0310 15:26:17.838492 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52cb80dd-dcc6-477f-83d4-810da076df4f","Type":"ContainerDied","Data":"8c3d3c652d6e009f8d2508749dced2dbc045e64d8392b11ae60b3d432ac50175"} Mar 10 15:26:17 crc kubenswrapper[4795]: I0310 15:26:17.838548 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52cb80dd-dcc6-477f-83d4-810da076df4f","Type":"ContainerDied","Data":"c405256780b813c61c4f47ea069dfd731cc2f6405f5169844bf7a4237545d35d"} Mar 10 15:26:17 crc kubenswrapper[4795]: I0310 15:26:17.840969 4795 generic.go:334] "Generic (PLEG): container finished" podID="1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" containerID="16a729f72fc0271e64c35902563148df00d1d2da53c0acd6891dc03dbb6cf59e" exitCode=0 Mar 10 15:26:17 crc kubenswrapper[4795]: I0310 15:26:17.840990 4795 generic.go:334] "Generic (PLEG): container finished" podID="1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" containerID="e28021df84b69443e74e2db18d55c6154ebe19ca88de4665c973f2fcbe4449ed" exitCode=143 Mar 10 15:26:17 crc kubenswrapper[4795]: I0310 15:26:17.841059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e","Type":"ContainerDied","Data":"16a729f72fc0271e64c35902563148df00d1d2da53c0acd6891dc03dbb6cf59e"} Mar 10 15:26:17 crc kubenswrapper[4795]: I0310 15:26:17.841133 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e","Type":"ContainerDied","Data":"e28021df84b69443e74e2db18d55c6154ebe19ca88de4665c973f2fcbe4449ed"} Mar 10 15:26:18 crc kubenswrapper[4795]: I0310 15:26:18.539421 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:26:18 crc kubenswrapper[4795]: I0310 15:26:18.539903 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:26:18 crc kubenswrapper[4795]: I0310 15:26:18.849386 4795 generic.go:334] "Generic (PLEG): container finished" podID="72f8600b-d744-4318-a08b-4f0a6da6bb9b" containerID="496a0ce2c98b076ed87eb51f2cf780382ff8c97977b8e27b77d3a409e489d75f" exitCode=0 Mar 10 15:26:18 crc kubenswrapper[4795]: I0310 15:26:18.849410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6dkln" event={"ID":"72f8600b-d744-4318-a08b-4f0a6da6bb9b","Type":"ContainerDied","Data":"496a0ce2c98b076ed87eb51f2cf780382ff8c97977b8e27b77d3a409e489d75f"} Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.097519 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dcccd79bc-8z54g"] Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.136402 
4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67547556b6-45876"] Mar 10 15:26:21 crc kubenswrapper[4795]: E0310 15:26:21.136731 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6867b14-b00d-4722-8e43-ae8821b9f27d" containerName="init" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.136743 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6867b14-b00d-4722-8e43-ae8821b9f27d" containerName="init" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.136902 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6867b14-b00d-4722-8e43-ae8821b9f27d" containerName="init" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.138340 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.140706 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.173794 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67547556b6-45876"] Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.206837 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cc9fc6c77-knd89"] Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.213959 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-secret-key\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.213998 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-config-data\") 
pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.214025 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-combined-ca-bundle\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.214103 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-scripts\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.214129 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsf2c\" (UniqueName: \"kubernetes.io/projected/9c0c78d2-7838-4836-975b-87312ba1c49e-kube-api-access-rsf2c\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.214155 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c0c78d2-7838-4836-975b-87312ba1c49e-logs\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.214179 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-tls-certs\") pod 
\"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.220053 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5869d54dfb-2wjww"] Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.231274 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.241190 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5869d54dfb-2wjww"] Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.315932 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-secret-key\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.315964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-config-data\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.315993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-combined-ca-bundle\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.316029 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/379541ea-de81-488c-b6dc-2f5873fdfbeb-horizon-tls-certs\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.316124 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-scripts\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.316141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/379541ea-de81-488c-b6dc-2f5873fdfbeb-horizon-secret-key\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.316166 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsf2c\" (UniqueName: \"kubernetes.io/projected/9c0c78d2-7838-4836-975b-87312ba1c49e-kube-api-access-rsf2c\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.316220 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c0c78d2-7838-4836-975b-87312ba1c49e-logs\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.316241 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8nw9\" (UniqueName: 
\"kubernetes.io/projected/379541ea-de81-488c-b6dc-2f5873fdfbeb-kube-api-access-n8nw9\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.316266 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-tls-certs\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.316286 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/379541ea-de81-488c-b6dc-2f5873fdfbeb-logs\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.316324 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379541ea-de81-488c-b6dc-2f5873fdfbeb-scripts\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.316350 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/379541ea-de81-488c-b6dc-2f5873fdfbeb-config-data\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.316372 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/379541ea-de81-488c-b6dc-2f5873fdfbeb-combined-ca-bundle\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.317008 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-scripts\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.317332 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c0c78d2-7838-4836-975b-87312ba1c49e-logs\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.317586 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-config-data\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.321466 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-secret-key\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.323427 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-tls-certs\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " 
pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.332475 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-combined-ca-bundle\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.332539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsf2c\" (UniqueName: \"kubernetes.io/projected/9c0c78d2-7838-4836-975b-87312ba1c49e-kube-api-access-rsf2c\") pod \"horizon-67547556b6-45876\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.417392 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/379541ea-de81-488c-b6dc-2f5873fdfbeb-horizon-secret-key\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.417496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8nw9\" (UniqueName: \"kubernetes.io/projected/379541ea-de81-488c-b6dc-2f5873fdfbeb-kube-api-access-n8nw9\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.417526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/379541ea-de81-488c-b6dc-2f5873fdfbeb-logs\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 
15:26:21.417843 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379541ea-de81-488c-b6dc-2f5873fdfbeb-scripts\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.418102 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/379541ea-de81-488c-b6dc-2f5873fdfbeb-config-data\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.418147 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379541ea-de81-488c-b6dc-2f5873fdfbeb-combined-ca-bundle\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.418312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/379541ea-de81-488c-b6dc-2f5873fdfbeb-logs\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.418655 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/379541ea-de81-488c-b6dc-2f5873fdfbeb-scripts\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.418702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/379541ea-de81-488c-b6dc-2f5873fdfbeb-horizon-tls-certs\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.419490 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/379541ea-de81-488c-b6dc-2f5873fdfbeb-config-data\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.421976 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/379541ea-de81-488c-b6dc-2f5873fdfbeb-horizon-tls-certs\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.422278 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/379541ea-de81-488c-b6dc-2f5873fdfbeb-combined-ca-bundle\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.422283 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/379541ea-de81-488c-b6dc-2f5873fdfbeb-horizon-secret-key\") pod \"horizon-5869d54dfb-2wjww\" (UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.432694 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8nw9\" (UniqueName: \"kubernetes.io/projected/379541ea-de81-488c-b6dc-2f5873fdfbeb-kube-api-access-n8nw9\") pod \"horizon-5869d54dfb-2wjww\" 
(UID: \"379541ea-de81-488c-b6dc-2f5873fdfbeb\") " pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.470166 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67547556b6-45876" Mar 10 15:26:21 crc kubenswrapper[4795]: I0310 15:26:21.552303 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:26:22 crc kubenswrapper[4795]: I0310 15:26:22.688326 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:22 crc kubenswrapper[4795]: I0310 15:26:22.752390 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-f2x9q"] Mar 10 15:26:22 crc kubenswrapper[4795]: I0310 15:26:22.752855 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" podUID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" containerName="dnsmasq-dns" containerID="cri-o://a12932f70a4f14d304c8d96c2d950d003db286688eb85d5c5b8e8e0c297fb459" gracePeriod=10 Mar 10 15:26:22 crc kubenswrapper[4795]: I0310 15:26:22.884323 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" podUID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Mar 10 15:26:22 crc kubenswrapper[4795]: I0310 15:26:22.898047 4795 generic.go:334] "Generic (PLEG): container finished" podID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" containerID="a12932f70a4f14d304c8d96c2d950d003db286688eb85d5c5b8e8e0c297fb459" exitCode=0 Mar 10 15:26:22 crc kubenswrapper[4795]: I0310 15:26:22.898118 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" 
event={"ID":"f1e1d893-1c72-4cea-aa9a-614fc05bd08c","Type":"ContainerDied","Data":"a12932f70a4f14d304c8d96c2d950d003db286688eb85d5c5b8e8e0c297fb459"} Mar 10 15:26:27 crc kubenswrapper[4795]: E0310 15:26:27.750747 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 10 15:26:27 crc kubenswrapper[4795]: E0310 15:26:27.752094 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n54dh99h5c8h689h65fhc7h547h8h645h9h564hb8h67ch58h59bhc6h546h5bch86hb8h688h95h65bh56bhcfh64ch56dh588hb8h59h579h567q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gc24m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Dr
op:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7dcccd79bc-8z54g_openstack(9f67b33e-b7f7-4f63-9e76-886e5e916da2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:26:27 crc kubenswrapper[4795]: E0310 15:26:27.754249 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7dcccd79bc-8z54g" podUID="9f67b33e-b7f7-4f63-9e76-886e5e916da2" Mar 10 15:26:27 crc kubenswrapper[4795]: E0310 15:26:27.762874 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 10 15:26:27 crc kubenswrapper[4795]: E0310 15:26:27.763024 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n89h579hb8hc9h5dbhd9h54dhc7h545h59fh5c7h554hd9h674h58fhcdhcch67dh684h665h644h59dh5dh569h95h57dh66h4h599h654h4h74q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dpsmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-6846f8f877-gl4gz_openstack(8c5fe7dd-380e-4ae8-baa5-3a518e340d3a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:26:27 crc kubenswrapper[4795]: E0310 
15:26:27.765207 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6846f8f877-gl4gz" podUID="8c5fe7dd-380e-4ae8-baa5-3a518e340d3a" Mar 10 15:26:27 crc kubenswrapper[4795]: I0310 15:26:27.881049 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" podUID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: connect: connection refused" Mar 10 15:26:29 crc kubenswrapper[4795]: E0310 15:26:29.335593 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 10 15:26:29 crc kubenswrapper[4795]: E0310 15:26:29.336046 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n9h84h59ch5cbh67h575h65ch657h59bh59bh576h75hfbh687h67bh5d8h644h69h5c7h5f4h55h5c6hch7dh6bh5bch54h9h677h86h547h597q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wvwmm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-5cc9fc6c77-knd89_openstack(79398470-9b8c-45b8-99c2-90b6a0927da4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:26:29 crc kubenswrapper[4795]: E0310 
15:26:29.355682 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-5cc9fc6c77-knd89" podUID="79398470-9b8c-45b8-99c2-90b6a0927da4" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.454734 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.486300 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.487687 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588281 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-scripts\") pod \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjpdb\" (UniqueName: \"kubernetes.io/projected/52cb80dd-dcc6-477f-83d4-810da076df4f-kube-api-access-jjpdb\") pod \"52cb80dd-dcc6-477f-83d4-810da076df4f\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588368 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-combined-ca-bundle\") pod \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588389 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-internal-tls-certs\") pod \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588420 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-combined-ca-bundle\") pod \"52cb80dd-dcc6-477f-83d4-810da076df4f\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-config-data\") pod \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588489 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588522 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-public-tls-certs\") pod \"52cb80dd-dcc6-477f-83d4-810da076df4f\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588545 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-config-data\") pod \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588577 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhhk2\" (UniqueName: \"kubernetes.io/projected/72f8600b-d744-4318-a08b-4f0a6da6bb9b-kube-api-access-lhhk2\") pod \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588615 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-scripts\") pod \"52cb80dd-dcc6-477f-83d4-810da076df4f\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588658 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-httpd-run\") pod \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588681 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-logs\") pod \"52cb80dd-dcc6-477f-83d4-810da076df4f\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"52cb80dd-dcc6-477f-83d4-810da076df4f\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " Mar 10 15:26:29 crc 
kubenswrapper[4795]: I0310 15:26:29.588756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-credential-keys\") pod \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-httpd-run\") pod \"52cb80dd-dcc6-477f-83d4-810da076df4f\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588858 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvcvp\" (UniqueName: \"kubernetes.io/projected/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-kube-api-access-vvcvp\") pod \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588881 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-scripts\") pod \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588904 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-combined-ca-bundle\") pod \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588931 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-fernet-keys\") pod 
\"72f8600b-d744-4318-a08b-4f0a6da6bb9b\" (UID: \"72f8600b-d744-4318-a08b-4f0a6da6bb9b\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588952 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-config-data\") pod \"52cb80dd-dcc6-477f-83d4-810da076df4f\" (UID: \"52cb80dd-dcc6-477f-83d4-810da076df4f\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.588979 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-logs\") pod \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\" (UID: \"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e\") " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.589659 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-logs" (OuterVolumeSpecName: "logs") pod "52cb80dd-dcc6-477f-83d4-810da076df4f" (UID: "52cb80dd-dcc6-477f-83d4-810da076df4f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.589796 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "52cb80dd-dcc6-477f-83d4-810da076df4f" (UID: "52cb80dd-dcc6-477f-83d4-810da076df4f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.590336 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-logs" (OuterVolumeSpecName: "logs") pod "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" (UID: "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.594103 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "72f8600b-d744-4318-a08b-4f0a6da6bb9b" (UID: "72f8600b-d744-4318-a08b-4f0a6da6bb9b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.595101 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52cb80dd-dcc6-477f-83d4-810da076df4f-kube-api-access-jjpdb" (OuterVolumeSpecName: "kube-api-access-jjpdb") pod "52cb80dd-dcc6-477f-83d4-810da076df4f" (UID: "52cb80dd-dcc6-477f-83d4-810da076df4f"). InnerVolumeSpecName "kube-api-access-jjpdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.595453 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "52cb80dd-dcc6-477f-83d4-810da076df4f" (UID: "52cb80dd-dcc6-477f-83d4-810da076df4f"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.595463 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-kube-api-access-vvcvp" (OuterVolumeSpecName: "kube-api-access-vvcvp") pod "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" (UID: "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e"). InnerVolumeSpecName "kube-api-access-vvcvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.597364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72f8600b-d744-4318-a08b-4f0a6da6bb9b-kube-api-access-lhhk2" (OuterVolumeSpecName: "kube-api-access-lhhk2") pod "72f8600b-d744-4318-a08b-4f0a6da6bb9b" (UID: "72f8600b-d744-4318-a08b-4f0a6da6bb9b"). InnerVolumeSpecName "kube-api-access-lhhk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.597789 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-scripts" (OuterVolumeSpecName: "scripts") pod "72f8600b-d744-4318-a08b-4f0a6da6bb9b" (UID: "72f8600b-d744-4318-a08b-4f0a6da6bb9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.597971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" (UID: "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.598181 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-scripts" (OuterVolumeSpecName: "scripts") pod "52cb80dd-dcc6-477f-83d4-810da076df4f" (UID: "52cb80dd-dcc6-477f-83d4-810da076df4f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.598849 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-scripts" (OuterVolumeSpecName: "scripts") pod "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" (UID: "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.599801 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "72f8600b-d744-4318-a08b-4f0a6da6bb9b" (UID: "72f8600b-d744-4318-a08b-4f0a6da6bb9b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.600187 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" (UID: "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.632744 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-config-data" (OuterVolumeSpecName: "config-data") pod "72f8600b-d744-4318-a08b-4f0a6da6bb9b" (UID: "72f8600b-d744-4318-a08b-4f0a6da6bb9b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.633709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" (UID: "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.634606 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72f8600b-d744-4318-a08b-4f0a6da6bb9b" (UID: "72f8600b-d744-4318-a08b-4f0a6da6bb9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.649560 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52cb80dd-dcc6-477f-83d4-810da076df4f" (UID: "52cb80dd-dcc6-477f-83d4-810da076df4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.652122 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "52cb80dd-dcc6-477f-83d4-810da076df4f" (UID: "52cb80dd-dcc6-477f-83d4-810da076df4f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.652417 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" (UID: "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.661971 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-config-data" (OuterVolumeSpecName: "config-data") pod "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" (UID: "1690b419-fdd1-4a50-9a57-1ef4c2a9c49e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.662191 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-config-data" (OuterVolumeSpecName: "config-data") pod "52cb80dd-dcc6-477f-83d4-810da076df4f" (UID: "52cb80dd-dcc6-477f-83d4-810da076df4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691725 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvcvp\" (UniqueName: \"kubernetes.io/projected/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-kube-api-access-vvcvp\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691759 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691772 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691783 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691795 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691806 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691816 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691827 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jjpdb\" (UniqueName: \"kubernetes.io/projected/52cb80dd-dcc6-477f-83d4-810da076df4f-kube-api-access-jjpdb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691838 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691853 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691865 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691875 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691911 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691927 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691938 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-config-data\") on node 
\"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691949 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhhk2\" (UniqueName: \"kubernetes.io/projected/72f8600b-d744-4318-a08b-4f0a6da6bb9b-kube-api-access-lhhk2\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.691989 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52cb80dd-dcc6-477f-83d4-810da076df4f-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.692004 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.692014 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.692093 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.692108 4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/72f8600b-d744-4318-a08b-4f0a6da6bb9b-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.692121 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/52cb80dd-dcc6-477f-83d4-810da076df4f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.720239 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" 
(UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.723440 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.794020 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.794052 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.957666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"52cb80dd-dcc6-477f-83d4-810da076df4f","Type":"ContainerDied","Data":"8e3a58ec33b894c6cf37a4b927105a9266d1c93125854c1a114ab94f09ebad5e"} Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.957724 4795 scope.go:117] "RemoveContainer" containerID="8c3d3c652d6e009f8d2508749dced2dbc045e64d8392b11ae60b3d432ac50175" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.957863 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.970109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1690b419-fdd1-4a50-9a57-1ef4c2a9c49e","Type":"ContainerDied","Data":"e547a89a1e477fc2106107051f9346b74deb8208ebe3c3e851cff6df4ec2f9af"} Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.970219 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.977416 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6dkln" Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.977875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6dkln" event={"ID":"72f8600b-d744-4318-a08b-4f0a6da6bb9b","Type":"ContainerDied","Data":"6b03268b43a520a77b884c74248a9e3fb66b5b97b4909e27e0c6e54eacc3d0ea"} Mar 10 15:26:29 crc kubenswrapper[4795]: I0310 15:26:29.977915 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b03268b43a520a77b884c74248a9e3fb66b5b97b4909e27e0c6e54eacc3d0ea" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.019501 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.031407 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.040189 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.046604 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.056433 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:26:30 crc kubenswrapper[4795]: E0310 15:26:30.056866 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" containerName="glance-httpd" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.056890 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" containerName="glance-httpd" Mar 
10 15:26:30 crc kubenswrapper[4795]: E0310 15:26:30.056907 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" containerName="glance-log" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.056914 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" containerName="glance-log" Mar 10 15:26:30 crc kubenswrapper[4795]: E0310 15:26:30.056942 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52cb80dd-dcc6-477f-83d4-810da076df4f" containerName="glance-log" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.056950 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cb80dd-dcc6-477f-83d4-810da076df4f" containerName="glance-log" Mar 10 15:26:30 crc kubenswrapper[4795]: E0310 15:26:30.056971 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f8600b-d744-4318-a08b-4f0a6da6bb9b" containerName="keystone-bootstrap" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.056979 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f8600b-d744-4318-a08b-4f0a6da6bb9b" containerName="keystone-bootstrap" Mar 10 15:26:30 crc kubenswrapper[4795]: E0310 15:26:30.056993 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52cb80dd-dcc6-477f-83d4-810da076df4f" containerName="glance-httpd" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.057000 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="52cb80dd-dcc6-477f-83d4-810da076df4f" containerName="glance-httpd" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.057241 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" containerName="glance-log" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.057269 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="52cb80dd-dcc6-477f-83d4-810da076df4f" containerName="glance-httpd" Mar 10 15:26:30 crc 
kubenswrapper[4795]: I0310 15:26:30.057284 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="52cb80dd-dcc6-477f-83d4-810da076df4f" containerName="glance-log" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.057302 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="72f8600b-d744-4318-a08b-4f0a6da6bb9b" containerName="keystone-bootstrap" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.057312 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" containerName="glance-httpd" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.058473 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.062023 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bxn5s" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.062867 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.063128 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.065366 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.071275 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.071335 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.076739 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.079250 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.079390 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.086960 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202312 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-logs\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202390 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202440 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202491 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4brgx\" 
(UniqueName: \"kubernetes.io/projected/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-kube-api-access-4brgx\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202515 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202539 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202557 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202574 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mvn66\" (UniqueName: \"kubernetes.io/projected/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-kube-api-access-mvn66\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202639 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202683 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202725 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202771 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202791 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-logs\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.202816 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305585 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4brgx\" (UniqueName: \"kubernetes.io/projected/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-kube-api-access-4brgx\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305617 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305649 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305671 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305698 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvn66\" (UniqueName: \"kubernetes.io/projected/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-kube-api-access-mvn66\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305779 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305816 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305877 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305912 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305964 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.305999 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.306029 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-logs\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.306057 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.306252 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.306927 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " 
pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.307254 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-logs\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.306252 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.308169 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.308441 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-logs\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.311562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.312096 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.313294 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.314833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.314949 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.315487 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.320264 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-config-data\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.323676 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-scripts\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.330842 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4brgx\" (UniqueName: \"kubernetes.io/projected/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-kube-api-access-4brgx\") pod \"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.339691 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvn66\" (UniqueName: \"kubernetes.io/projected/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-kube-api-access-mvn66\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.359369 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.375133 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-default-external-api-0\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.388696 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.402822 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.593768 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6dkln"] Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.603299 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6dkln"] Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.686939 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-9wkrh"] Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.691523 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.694150 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mn9kr" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.694336 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.694559 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.694715 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.694822 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.703504 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9wkrh"] Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.814462 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhs6p\" (UniqueName: \"kubernetes.io/projected/f9c3084a-a7d5-4703-836b-951571462fee-kube-api-access-dhs6p\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.814801 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-config-data\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.814828 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-combined-ca-bundle\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.814857 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-credential-keys\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.814923 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-scripts\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.814969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-fernet-keys\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.917078 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-config-data\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.917138 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-combined-ca-bundle\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.917178 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-credential-keys\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.917226 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-scripts\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.917271 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-fernet-keys\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.917325 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhs6p\" (UniqueName: \"kubernetes.io/projected/f9c3084a-a7d5-4703-836b-951571462fee-kube-api-access-dhs6p\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.921766 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-scripts\") pod \"keystone-bootstrap-9wkrh\" (UID: 
\"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.921838 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-fernet-keys\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.922602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-combined-ca-bundle\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.923261 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-credential-keys\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.929384 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-config-data\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:30 crc kubenswrapper[4795]: I0310 15:26:30.935486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhs6p\" (UniqueName: \"kubernetes.io/projected/f9c3084a-a7d5-4703-836b-951571462fee-kube-api-access-dhs6p\") pod \"keystone-bootstrap-9wkrh\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:31 crc kubenswrapper[4795]: I0310 
15:26:31.017331 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:31 crc kubenswrapper[4795]: I0310 15:26:31.488667 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1690b419-fdd1-4a50-9a57-1ef4c2a9c49e" path="/var/lib/kubelet/pods/1690b419-fdd1-4a50-9a57-1ef4c2a9c49e/volumes" Mar 10 15:26:31 crc kubenswrapper[4795]: I0310 15:26:31.489554 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52cb80dd-dcc6-477f-83d4-810da076df4f" path="/var/lib/kubelet/pods/52cb80dd-dcc6-477f-83d4-810da076df4f/volumes" Mar 10 15:26:31 crc kubenswrapper[4795]: I0310 15:26:31.490275 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72f8600b-d744-4318-a08b-4f0a6da6bb9b" path="/var/lib/kubelet/pods/72f8600b-d744-4318-a08b-4f0a6da6bb9b/volumes" Mar 10 15:26:31 crc kubenswrapper[4795]: I0310 15:26:31.997523 4795 generic.go:334] "Generic (PLEG): container finished" podID="07ac8661-08c8-49aa-ae40-cf472895a954" containerID="73e3d665d4e95bccb029e0366456de02da3d428fe7eb1c17ddf7df05d991dd13" exitCode=0 Mar 10 15:26:31 crc kubenswrapper[4795]: I0310 15:26:31.997576 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tk68s" event={"ID":"07ac8661-08c8-49aa-ae40-cf472895a954","Type":"ContainerDied","Data":"73e3d665d4e95bccb029e0366456de02da3d428fe7eb1c17ddf7df05d991dd13"} Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.209347 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.217788 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.334855 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc24m\" (UniqueName: \"kubernetes.io/projected/9f67b33e-b7f7-4f63-9e76-886e5e916da2-kube-api-access-gc24m\") pod \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.334940 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-horizon-secret-key\") pod \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.335028 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-config-data\") pod \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.335053 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpsmq\" (UniqueName: \"kubernetes.io/projected/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-kube-api-access-dpsmq\") pod \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.335096 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-logs\") pod \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.335152 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-config-data\") pod \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.335197 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-scripts\") pod \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.335253 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f67b33e-b7f7-4f63-9e76-886e5e916da2-logs\") pod \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.335287 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-scripts\") pod \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\" (UID: \"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.335324 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9f67b33e-b7f7-4f63-9e76-886e5e916da2-horizon-secret-key\") pod \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\" (UID: \"9f67b33e-b7f7-4f63-9e76-886e5e916da2\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.336484 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-logs" (OuterVolumeSpecName: "logs") pod "8c5fe7dd-380e-4ae8-baa5-3a518e340d3a" (UID: "8c5fe7dd-380e-4ae8-baa5-3a518e340d3a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.337037 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-config-data" (OuterVolumeSpecName: "config-data") pod "8c5fe7dd-380e-4ae8-baa5-3a518e340d3a" (UID: "8c5fe7dd-380e-4ae8-baa5-3a518e340d3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.337043 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f67b33e-b7f7-4f63-9e76-886e5e916da2-logs" (OuterVolumeSpecName: "logs") pod "9f67b33e-b7f7-4f63-9e76-886e5e916da2" (UID: "9f67b33e-b7f7-4f63-9e76-886e5e916da2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.337657 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-scripts" (OuterVolumeSpecName: "scripts") pod "9f67b33e-b7f7-4f63-9e76-886e5e916da2" (UID: "9f67b33e-b7f7-4f63-9e76-886e5e916da2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.337678 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-scripts" (OuterVolumeSpecName: "scripts") pod "8c5fe7dd-380e-4ae8-baa5-3a518e340d3a" (UID: "8c5fe7dd-380e-4ae8-baa5-3a518e340d3a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.338104 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-config-data" (OuterVolumeSpecName: "config-data") pod "9f67b33e-b7f7-4f63-9e76-886e5e916da2" (UID: "9f67b33e-b7f7-4f63-9e76-886e5e916da2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.345240 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "8c5fe7dd-380e-4ae8-baa5-3a518e340d3a" (UID: "8c5fe7dd-380e-4ae8-baa5-3a518e340d3a"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.345307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f67b33e-b7f7-4f63-9e76-886e5e916da2-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9f67b33e-b7f7-4f63-9e76-886e5e916da2" (UID: "9f67b33e-b7f7-4f63-9e76-886e5e916da2"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.345337 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f67b33e-b7f7-4f63-9e76-886e5e916da2-kube-api-access-gc24m" (OuterVolumeSpecName: "kube-api-access-gc24m") pod "9f67b33e-b7f7-4f63-9e76-886e5e916da2" (UID: "9f67b33e-b7f7-4f63-9e76-886e5e916da2"). InnerVolumeSpecName "kube-api-access-gc24m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.345320 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-kube-api-access-dpsmq" (OuterVolumeSpecName: "kube-api-access-dpsmq") pod "8c5fe7dd-380e-4ae8-baa5-3a518e340d3a" (UID: "8c5fe7dd-380e-4ae8-baa5-3a518e340d3a"). InnerVolumeSpecName "kube-api-access-dpsmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.438430 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9f67b33e-b7f7-4f63-9e76-886e5e916da2-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.439605 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc24m\" (UniqueName: \"kubernetes.io/projected/9f67b33e-b7f7-4f63-9e76-886e5e916da2-kube-api-access-gc24m\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.439624 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.439636 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.439648 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpsmq\" (UniqueName: \"kubernetes.io/projected/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-kube-api-access-dpsmq\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.439657 4795 reconciler_common.go:293] "Volume 
detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.439665 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.439673 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9f67b33e-b7f7-4f63-9e76-886e5e916da2-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.439681 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9f67b33e-b7f7-4f63-9e76-886e5e916da2-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.439688 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: E0310 15:26:37.578227 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 10 15:26:37 crc kubenswrapper[4795]: E0310 15:26:37.578375 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nd5h64fhcch85hc5h67dh66fh68bh57bh5dfh664h598h5c9h5d6h568h54h5c8h7dh664hc9h679h97h5f9h644h569h559h644hf7h66dh67h4h7cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdqbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(79684c90-dc4f-4187-a086-c0777de981e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:26:37 crc kubenswrapper[4795]: E0310 15:26:37.585471 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c5fe7dd_380e_4ae8_baa5_3a518e340d3a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c5fe7dd_380e_4ae8_baa5_3a518e340d3a.slice/crio-5ac6658198ea8b1e64aad25eef1c9c2925fb8f78ee973de5c716a691f72ad44a\": RecentStats: unable to find data in memory cache]" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.659186 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.665911 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.670630 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.747749 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-nb\") pod \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.748947 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-swift-storage-0\") pod \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.749136 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-config\") pod \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.749422 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-scripts\") pod \"79398470-9b8c-45b8-99c2-90b6a0927da4\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.749566 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/79398470-9b8c-45b8-99c2-90b6a0927da4-horizon-secret-key\") pod \"79398470-9b8c-45b8-99c2-90b6a0927da4\" (UID: 
\"79398470-9b8c-45b8-99c2-90b6a0927da4\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.749718 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-svc\") pod \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.749858 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgfz2\" (UniqueName: \"kubernetes.io/projected/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-kube-api-access-xgfz2\") pod \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.749974 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zqtk\" (UniqueName: \"kubernetes.io/projected/07ac8661-08c8-49aa-ae40-cf472895a954-kube-api-access-4zqtk\") pod \"07ac8661-08c8-49aa-ae40-cf472895a954\" (UID: \"07ac8661-08c8-49aa-ae40-cf472895a954\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.750119 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-combined-ca-bundle\") pod \"07ac8661-08c8-49aa-ae40-cf472895a954\" (UID: \"07ac8661-08c8-49aa-ae40-cf472895a954\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.750242 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-config-data\") pod \"79398470-9b8c-45b8-99c2-90b6a0927da4\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.751301 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/79398470-9b8c-45b8-99c2-90b6a0927da4-logs\") pod \"79398470-9b8c-45b8-99c2-90b6a0927da4\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.751486 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-sb\") pod \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\" (UID: \"f1e1d893-1c72-4cea-aa9a-614fc05bd08c\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.751613 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-config\") pod \"07ac8661-08c8-49aa-ae40-cf472895a954\" (UID: \"07ac8661-08c8-49aa-ae40-cf472895a954\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.751793 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvwmm\" (UniqueName: \"kubernetes.io/projected/79398470-9b8c-45b8-99c2-90b6a0927da4-kube-api-access-wvwmm\") pod \"79398470-9b8c-45b8-99c2-90b6a0927da4\" (UID: \"79398470-9b8c-45b8-99c2-90b6a0927da4\") " Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.756042 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-kube-api-access-xgfz2" (OuterVolumeSpecName: "kube-api-access-xgfz2") pod "f1e1d893-1c72-4cea-aa9a-614fc05bd08c" (UID: "f1e1d893-1c72-4cea-aa9a-614fc05bd08c"). InnerVolumeSpecName "kube-api-access-xgfz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.756599 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-scripts" (OuterVolumeSpecName: "scripts") pod "79398470-9b8c-45b8-99c2-90b6a0927da4" (UID: "79398470-9b8c-45b8-99c2-90b6a0927da4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.759268 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79398470-9b8c-45b8-99c2-90b6a0927da4-kube-api-access-wvwmm" (OuterVolumeSpecName: "kube-api-access-wvwmm") pod "79398470-9b8c-45b8-99c2-90b6a0927da4" (UID: "79398470-9b8c-45b8-99c2-90b6a0927da4"). InnerVolumeSpecName "kube-api-access-wvwmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.762856 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79398470-9b8c-45b8-99c2-90b6a0927da4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "79398470-9b8c-45b8-99c2-90b6a0927da4" (UID: "79398470-9b8c-45b8-99c2-90b6a0927da4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.765933 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79398470-9b8c-45b8-99c2-90b6a0927da4-logs" (OuterVolumeSpecName: "logs") pod "79398470-9b8c-45b8-99c2-90b6a0927da4" (UID: "79398470-9b8c-45b8-99c2-90b6a0927da4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.766233 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07ac8661-08c8-49aa-ae40-cf472895a954-kube-api-access-4zqtk" (OuterVolumeSpecName: "kube-api-access-4zqtk") pod "07ac8661-08c8-49aa-ae40-cf472895a954" (UID: "07ac8661-08c8-49aa-ae40-cf472895a954"). InnerVolumeSpecName "kube-api-access-4zqtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.766651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-config-data" (OuterVolumeSpecName: "config-data") pod "79398470-9b8c-45b8-99c2-90b6a0927da4" (UID: "79398470-9b8c-45b8-99c2-90b6a0927da4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.792969 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07ac8661-08c8-49aa-ae40-cf472895a954" (UID: "07ac8661-08c8-49aa-ae40-cf472895a954"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.799114 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-config" (OuterVolumeSpecName: "config") pod "07ac8661-08c8-49aa-ae40-cf472895a954" (UID: "07ac8661-08c8-49aa-ae40-cf472895a954"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.805580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1e1d893-1c72-4cea-aa9a-614fc05bd08c" (UID: "f1e1d893-1c72-4cea-aa9a-614fc05bd08c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.805667 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1e1d893-1c72-4cea-aa9a-614fc05bd08c" (UID: "f1e1d893-1c72-4cea-aa9a-614fc05bd08c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.808743 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1e1d893-1c72-4cea-aa9a-614fc05bd08c" (UID: "f1e1d893-1c72-4cea-aa9a-614fc05bd08c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.816929 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f1e1d893-1c72-4cea-aa9a-614fc05bd08c" (UID: "f1e1d893-1c72-4cea-aa9a-614fc05bd08c"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.819373 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-config" (OuterVolumeSpecName: "config") pod "f1e1d893-1c72-4cea-aa9a-614fc05bd08c" (UID: "f1e1d893-1c72-4cea-aa9a-614fc05bd08c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854687 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvwmm\" (UniqueName: \"kubernetes.io/projected/79398470-9b8c-45b8-99c2-90b6a0927da4-kube-api-access-wvwmm\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854755 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854765 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854776 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854786 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854795 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/79398470-9b8c-45b8-99c2-90b6a0927da4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854803 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854811 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgfz2\" (UniqueName: \"kubernetes.io/projected/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-kube-api-access-xgfz2\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854820 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zqtk\" (UniqueName: \"kubernetes.io/projected/07ac8661-08c8-49aa-ae40-cf472895a954-kube-api-access-4zqtk\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854829 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854843 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/79398470-9b8c-45b8-99c2-90b6a0927da4-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854852 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79398470-9b8c-45b8-99c2-90b6a0927da4-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.854860 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1e1d893-1c72-4cea-aa9a-614fc05bd08c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 
crc kubenswrapper[4795]: I0310 15:26:37.854868 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/07ac8661-08c8-49aa-ae40-cf472895a954-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.880948 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" podUID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.128:5353: i/o timeout" Mar 10 15:26:37 crc kubenswrapper[4795]: I0310 15:26:37.881081 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.053050 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7dcccd79bc-8z54g" event={"ID":"9f67b33e-b7f7-4f63-9e76-886e5e916da2","Type":"ContainerDied","Data":"458c6ba0162cc1fef29d218f364fd15a3344f36655dc1262f1eb657f608e5c45"} Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.053240 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7dcccd79bc-8z54g" Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.054853 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6846f8f877-gl4gz" event={"ID":"8c5fe7dd-380e-4ae8-baa5-3a518e340d3a","Type":"ContainerDied","Data":"5ac6658198ea8b1e64aad25eef1c9c2925fb8f78ee973de5c716a691f72ad44a"} Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.054870 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6846f8f877-gl4gz" Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.058704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" event={"ID":"f1e1d893-1c72-4cea-aa9a-614fc05bd08c","Type":"ContainerDied","Data":"fc0689864f2b1543d7bcfb93fe4a7903f0ae3e9fb7d0d2e8a576020ddcadbcc8"} Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.058988 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-f2x9q" Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.066833 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cc9fc6c77-knd89" event={"ID":"79398470-9b8c-45b8-99c2-90b6a0927da4","Type":"ContainerDied","Data":"736e294b9df55d7270d3c2a594a45779f54fcb79c8f8a131476b3eeb81b615db"} Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.066927 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cc9fc6c77-knd89" Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.069605 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-tk68s" event={"ID":"07ac8661-08c8-49aa-ae40-cf472895a954","Type":"ContainerDied","Data":"d087ae3107aef3faba20d9dea30e682158956e3fbb1394ab84673ef416a53646"} Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.069674 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d087ae3107aef3faba20d9dea30e682158956e3fbb1394ab84673ef416a53646" Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.069742 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-tk68s" Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.127384 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7dcccd79bc-8z54g"] Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.148476 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7dcccd79bc-8z54g"] Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.171522 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6846f8f877-gl4gz"] Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.187804 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6846f8f877-gl4gz"] Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.203247 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-f2x9q"] Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.222855 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-f2x9q"] Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.242924 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cc9fc6c77-knd89"] Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.250235 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cc9fc6c77-knd89"] Mar 10 15:26:38 crc kubenswrapper[4795]: E0310 15:26:38.942360 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 10 15:26:38 crc kubenswrapper[4795]: E0310 15:26:38.942506 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tw449,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-mg79m_openstack(7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:26:38 crc kubenswrapper[4795]: E0310 15:26:38.944118 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-mg79m" podUID="7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" Mar 10 15:26:38 crc kubenswrapper[4795]: I0310 15:26:38.947435 4795 scope.go:117] "RemoveContainer" containerID="c405256780b813c61c4f47ea069dfd731cc2f6405f5169844bf7a4237545d35d" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.073784 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5mr9"] Mar 10 15:26:39 crc kubenswrapper[4795]: E0310 15:26:39.074489 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07ac8661-08c8-49aa-ae40-cf472895a954" containerName="neutron-db-sync" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.074515 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="07ac8661-08c8-49aa-ae40-cf472895a954" containerName="neutron-db-sync" Mar 10 15:26:39 crc kubenswrapper[4795]: E0310 15:26:39.074571 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" containerName="init" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.074581 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" containerName="init" Mar 10 15:26:39 crc kubenswrapper[4795]: E0310 15:26:39.074601 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" containerName="dnsmasq-dns" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.074609 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" containerName="dnsmasq-dns" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.074974 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="07ac8661-08c8-49aa-ae40-cf472895a954" containerName="neutron-db-sync" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.075008 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" containerName="dnsmasq-dns" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.092308 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.111876 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5mr9"] Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.172516 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7968b684f6-5dwff"] Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.175247 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.181783 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.181864 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.181927 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.181949 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n64x\" (UniqueName: \"kubernetes.io/projected/69bec5a2-e73e-4319-95b2-093ff9223751-kube-api-access-5n64x\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.181994 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-svc\") pod 
\"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.182030 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-config\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.184310 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.185800 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.185987 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v9g98" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.193979 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.194377 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7968b684f6-5dwff"] Mar 10 15:26:39 crc kubenswrapper[4795]: E0310 15:26:39.246466 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-mg79m" podUID="7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.247610 4795 scope.go:117] "RemoveContainer" containerID="16a729f72fc0271e64c35902563148df00d1d2da53c0acd6891dc03dbb6cf59e" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.283582 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.283852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.283876 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgfsl\" (UniqueName: \"kubernetes.io/projected/8ebc3860-b7d7-4d35-b77a-3413647b0be4-kube-api-access-fgfsl\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.283935 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-combined-ca-bundle\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.283970 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-config\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.283989 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.284014 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n64x\" (UniqueName: \"kubernetes.io/projected/69bec5a2-e73e-4319-95b2-093ff9223751-kube-api-access-5n64x\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.284125 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-svc\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.284165 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-ovndb-tls-certs\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.284187 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-httpd-config\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.284967 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.285235 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.285311 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-config\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.285992 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-config\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.286203 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.286209 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-svc\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.303794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n64x\" (UniqueName: \"kubernetes.io/projected/69bec5a2-e73e-4319-95b2-093ff9223751-kube-api-access-5n64x\") pod \"dnsmasq-dns-6b7b667979-k5mr9\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.383621 4795 scope.go:117] "RemoveContainer" containerID="e28021df84b69443e74e2db18d55c6154ebe19ca88de4665c973f2fcbe4449ed" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.386894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgfsl\" (UniqueName: \"kubernetes.io/projected/8ebc3860-b7d7-4d35-b77a-3413647b0be4-kube-api-access-fgfsl\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.386936 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-combined-ca-bundle\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.386967 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-config\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.387033 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-ovndb-tls-certs\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.387051 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-httpd-config\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.393036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-combined-ca-bundle\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.393060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-httpd-config\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.395916 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-config\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.407133 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-ovndb-tls-certs\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.412740 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgfsl\" (UniqueName: \"kubernetes.io/projected/8ebc3860-b7d7-4d35-b77a-3413647b0be4-kube-api-access-fgfsl\") pod \"neutron-7968b684f6-5dwff\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.491467 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79398470-9b8c-45b8-99c2-90b6a0927da4" path="/var/lib/kubelet/pods/79398470-9b8c-45b8-99c2-90b6a0927da4/volumes" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.492055 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5fe7dd-380e-4ae8-baa5-3a518e340d3a" path="/var/lib/kubelet/pods/8c5fe7dd-380e-4ae8-baa5-3a518e340d3a/volumes" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.493808 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f67b33e-b7f7-4f63-9e76-886e5e916da2" path="/var/lib/kubelet/pods/9f67b33e-b7f7-4f63-9e76-886e5e916da2/volumes" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.494217 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1e1d893-1c72-4cea-aa9a-614fc05bd08c" path="/var/lib/kubelet/pods/f1e1d893-1c72-4cea-aa9a-614fc05bd08c/volumes" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.501174 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.507293 4795 scope.go:117] "RemoveContainer" containerID="a12932f70a4f14d304c8d96c2d950d003db286688eb85d5c5b8e8e0c297fb459" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.514319 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.549324 4795 scope.go:117] "RemoveContainer" containerID="5cf2a7c54e1b8c72b321e93336644b02065ab9ccdd40b337a73d7903681233c3" Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.737525 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5869d54dfb-2wjww"] Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.752495 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-9wkrh"] Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.820820 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:26:39 crc kubenswrapper[4795]: W0310 15:26:39.825293 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a9e70da_59e9_47fb_a8cd_89f3577ddcf8.slice/crio-117f2de952018f33e00c22822eb7bad69e5fabe1de2950d37260c33cc292996f WatchSource:0}: Error finding container 117f2de952018f33e00c22822eb7bad69e5fabe1de2950d37260c33cc292996f: Status 404 returned error can't find the container with id 117f2de952018f33e00c22822eb7bad69e5fabe1de2950d37260c33cc292996f Mar 10 15:26:39 crc kubenswrapper[4795]: I0310 15:26:39.946875 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67547556b6-45876"] Mar 10 15:26:39 crc kubenswrapper[4795]: W0310 15:26:39.961173 4795 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c0c78d2_7838_4836_975b_87312ba1c49e.slice/crio-5603890d477e9f6b0854b1cfcc7b7d716807379efd2cb6cfe7b92d745a827ab8 WatchSource:0}: Error finding container 5603890d477e9f6b0854b1cfcc7b7d716807379efd2cb6cfe7b92d745a827ab8: Status 404 returned error can't find the container with id 5603890d477e9f6b0854b1cfcc7b7d716807379efd2cb6cfe7b92d745a827ab8 Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.040326 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5mr9"] Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.208444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67547556b6-45876" event={"ID":"9c0c78d2-7838-4836-975b-87312ba1c49e","Type":"ContainerStarted","Data":"5603890d477e9f6b0854b1cfcc7b7d716807379efd2cb6cfe7b92d745a827ab8"} Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.209845 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8","Type":"ContainerStarted","Data":"117f2de952018f33e00c22822eb7bad69e5fabe1de2950d37260c33cc292996f"} Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.214171 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9ktzf" event={"ID":"e3e5de16-defe-4daa-94cc-3d50e3461dbd","Type":"ContainerStarted","Data":"4d2bb03ea1839fd98374fa87dd1f961fae081be8e384b53ef71953122c88e056"} Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.219058 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5869d54dfb-2wjww" event={"ID":"379541ea-de81-488c-b6dc-2f5873fdfbeb","Type":"ContainerStarted","Data":"b49149018663ab748d3d94423fb74bce51b8dba651a6409069e13e2ccc6ff650"} Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.226346 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-t2rg9" 
event={"ID":"f93d5e00-4866-406a-ad39-c3ab0b2156b0","Type":"ContainerStarted","Data":"2d097c82909a3759221c437fade51ca1cee2763b48aca5349f14ab15230759bd"} Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.229624 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-9ktzf" podStartSLOduration=4.255100786 podStartE2EDuration="28.229611392s" podCreationTimestamp="2026-03-10 15:26:12 +0000 UTC" firstStartedPulling="2026-03-10 15:26:13.595324486 +0000 UTC m=+1206.761065374" lastFinishedPulling="2026-03-10 15:26:37.569835082 +0000 UTC m=+1230.735575980" observedRunningTime="2026-03-10 15:26:40.226310367 +0000 UTC m=+1233.392051265" watchObservedRunningTime="2026-03-10 15:26:40.229611392 +0000 UTC m=+1233.395352290" Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.235291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9wkrh" event={"ID":"f9c3084a-a7d5-4703-836b-951571462fee","Type":"ContainerStarted","Data":"46f07e238062e9e174acd6f8104a4bf7a444b7d87dd9aced24ab8acbdfaea4e0"} Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.235341 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9wkrh" event={"ID":"f9c3084a-a7d5-4703-836b-951571462fee","Type":"ContainerStarted","Data":"ecde624beeb7e495c74b78283e03762088358ca501f841b4862931f1b247b9ac"} Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.242344 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7968b684f6-5dwff"] Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.248856 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-t2rg9" podStartSLOduration=4.454687463 podStartE2EDuration="28.248837791s" podCreationTimestamp="2026-03-10 15:26:12 +0000 UTC" firstStartedPulling="2026-03-10 15:26:13.772315487 +0000 UTC m=+1206.938056385" lastFinishedPulling="2026-03-10 15:26:37.566465815 +0000 UTC 
m=+1230.732206713" observedRunningTime="2026-03-10 15:26:40.248477911 +0000 UTC m=+1233.414218799" watchObservedRunningTime="2026-03-10 15:26:40.248837791 +0000 UTC m=+1233.414578679" Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.272178 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-9wkrh" podStartSLOduration=10.272156128 podStartE2EDuration="10.272156128s" podCreationTimestamp="2026-03-10 15:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:40.264789547 +0000 UTC m=+1233.430530445" watchObservedRunningTime="2026-03-10 15:26:40.272156128 +0000 UTC m=+1233.437897026" Mar 10 15:26:40 crc kubenswrapper[4795]: W0310 15:26:40.459127 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ebc3860_b7d7_4d35_b77a_3413647b0be4.slice/crio-8849b7a7d4087ecf37b2349bfce97f7b07f41c816828d16297074301e4d9d48c WatchSource:0}: Error finding container 8849b7a7d4087ecf37b2349bfce97f7b07f41c816828d16297074301e4d9d48c: Status 404 returned error can't find the container with id 8849b7a7d4087ecf37b2349bfce97f7b07f41c816828d16297074301e4d9d48c Mar 10 15:26:40 crc kubenswrapper[4795]: I0310 15:26:40.740868 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.222554 4795 scope.go:117] "RemoveContainer" containerID="3d31e4d5fbe6c4068da3104fbf96a6a0c42498e9bc8b835bbc0f00b31d232baf" Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.273497 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5c97bfbc45-gfdc4"] Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.283440 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5c97bfbc45-gfdc4" Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.288192 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c97bfbc45-gfdc4"] Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.289589 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.290632 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.324530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79684c90-dc4f-4187-a086-c0777de981e3","Type":"ContainerStarted","Data":"9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd"} Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.328220 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-internal-tls-certs\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4" Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.328267 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-public-tls-certs\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4" Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.328295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-config\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: 
\"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4" Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.328317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-combined-ca-bundle\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4" Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.328388 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-httpd-config\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4" Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.328412 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s729g\" (UniqueName: \"kubernetes.io/projected/382ef56c-07e8-4af1-87ee-09887564eabc-kube-api-access-s729g\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4" Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.328682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-ovndb-tls-certs\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4" Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.335119 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67547556b6-45876" 
event={"ID":"9c0c78d2-7838-4836-975b-87312ba1c49e","Type":"ContainerStarted","Data":"d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0"}
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.335163 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67547556b6-45876" event={"ID":"9c0c78d2-7838-4836-975b-87312ba1c49e","Type":"ContainerStarted","Data":"2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631"}
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.368956 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8","Type":"ContainerStarted","Data":"f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058"}
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.391316 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67547556b6-45876" podStartSLOduration=19.779292649 podStartE2EDuration="20.391295568s" podCreationTimestamp="2026-03-10 15:26:21 +0000 UTC" firstStartedPulling="2026-03-10 15:26:39.964058319 +0000 UTC m=+1233.129799217" lastFinishedPulling="2026-03-10 15:26:40.576061238 +0000 UTC m=+1233.741802136" observedRunningTime="2026-03-10 15:26:41.36376122 +0000 UTC m=+1234.529502118" watchObservedRunningTime="2026-03-10 15:26:41.391295568 +0000 UTC m=+1234.557036466"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.396899 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7968b684f6-5dwff" event={"ID":"8ebc3860-b7d7-4d35-b77a-3413647b0be4","Type":"ContainerStarted","Data":"546cd9fc01e7eb5fb85f6cec9477bb99c629b460a781f59f1023b01c37c4fc90"}
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.396945 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7968b684f6-5dwff" event={"ID":"8ebc3860-b7d7-4d35-b77a-3413647b0be4","Type":"ContainerStarted","Data":"31f007b4e6d09cb48e4de2181e5984486844b6e2026abbca00e17b4e06a47d6b"}
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.396955 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7968b684f6-5dwff" event={"ID":"8ebc3860-b7d7-4d35-b77a-3413647b0be4","Type":"ContainerStarted","Data":"8849b7a7d4087ecf37b2349bfce97f7b07f41c816828d16297074301e4d9d48c"}
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.397790 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7968b684f6-5dwff"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.405528 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e7caf4d-b232-4ef1-bcbb-9e11adf55727","Type":"ContainerStarted","Data":"453e204a9b34daf87d6b4dba478697f2414d230aa2b56ac8e6b184deb8066183"}
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.411440 4795 generic.go:334] "Generic (PLEG): container finished" podID="69bec5a2-e73e-4319-95b2-093ff9223751" containerID="ef412769130fb84e3fc2a68f44bdd2e033909e334644a4ef206ce9ed2b61c984" exitCode=0
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.411567 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" event={"ID":"69bec5a2-e73e-4319-95b2-093ff9223751","Type":"ContainerDied","Data":"ef412769130fb84e3fc2a68f44bdd2e033909e334644a4ef206ce9ed2b61c984"}
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.411604 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" event={"ID":"69bec5a2-e73e-4319-95b2-093ff9223751","Type":"ContainerStarted","Data":"a533556b8a747ea6c9075bde789e3a64664fd67df67761a9dc33f9e8254cd983"}
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.426309 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5869d54dfb-2wjww" event={"ID":"379541ea-de81-488c-b6dc-2f5873fdfbeb","Type":"ContainerStarted","Data":"8b5f970b0268e6106914f0b598eaecb88ed530afe1ac77d4e8a061caf5de8a43"}
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.426355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5869d54dfb-2wjww" event={"ID":"379541ea-de81-488c-b6dc-2f5873fdfbeb","Type":"ContainerStarted","Data":"fda45bfcb429056e077c887b6d9ab04ec6b270194cf00ad1620aa8a388161758"}
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.430055 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-public-tls-certs\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.430219 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-config\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.430254 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-combined-ca-bundle\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.430301 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-httpd-config\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.430334 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s729g\" (UniqueName: \"kubernetes.io/projected/382ef56c-07e8-4af1-87ee-09887564eabc-kube-api-access-s729g\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.430384 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-ovndb-tls-certs\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.430453 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-internal-tls-certs\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.436103 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-internal-tls-certs\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.439759 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7968b684f6-5dwff" podStartSLOduration=3.439735893 podStartE2EDuration="3.439735893s" podCreationTimestamp="2026-03-10 15:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:41.42670495 +0000 UTC m=+1234.592445858" watchObservedRunningTime="2026-03-10 15:26:41.439735893 +0000 UTC m=+1234.605476791"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.439915 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-combined-ca-bundle\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.441078 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-ovndb-tls-certs\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.441351 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-config\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.447695 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-public-tls-certs\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.447772 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-httpd-config\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.456815 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s729g\" (UniqueName: \"kubernetes.io/projected/382ef56c-07e8-4af1-87ee-09887564eabc-kube-api-access-s729g\") pod \"neutron-5c97bfbc45-gfdc4\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.459791 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5869d54dfb-2wjww" podStartSLOduration=19.641588980999998 podStartE2EDuration="20.459772785s" podCreationTimestamp="2026-03-10 15:26:21 +0000 UTC" firstStartedPulling="2026-03-10 15:26:39.752618213 +0000 UTC m=+1232.918359111" lastFinishedPulling="2026-03-10 15:26:40.570802017 +0000 UTC m=+1233.736542915" observedRunningTime="2026-03-10 15:26:41.448190084 +0000 UTC m=+1234.613930992" watchObservedRunningTime="2026-03-10 15:26:41.459772785 +0000 UTC m=+1234.625513683"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.471225 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67547556b6-45876"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.471288 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67547556b6-45876"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.554168 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5869d54dfb-2wjww"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.554223 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5869d54dfb-2wjww"
Mar 10 15:26:41 crc kubenswrapper[4795]: I0310 15:26:41.637889 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:42 crc kubenswrapper[4795]: I0310 15:26:42.289030 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5c97bfbc45-gfdc4"]
Mar 10 15:26:42 crc kubenswrapper[4795]: I0310 15:26:42.460334 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e7caf4d-b232-4ef1-bcbb-9e11adf55727","Type":"ContainerStarted","Data":"b73362a15e1523da055b029775569493559757e75675e22b21bdf30ecc6a3fbe"}
Mar 10 15:26:42 crc kubenswrapper[4795]: I0310 15:26:42.463294 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c97bfbc45-gfdc4" event={"ID":"382ef56c-07e8-4af1-87ee-09887564eabc","Type":"ContainerStarted","Data":"89070222faae0f834836fee0f929b509c5216d58609663fc3db3d525722ddcdf"}
Mar 10 15:26:42 crc kubenswrapper[4795]: I0310 15:26:42.466535 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" event={"ID":"69bec5a2-e73e-4319-95b2-093ff9223751","Type":"ContainerStarted","Data":"22cd34a61e4a54f714608081ebd8444a7aa3a9ab9e495bb0c4e4919582395480"}
Mar 10 15:26:42 crc kubenswrapper[4795]: I0310 15:26:42.466685 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9"
Mar 10 15:26:42 crc kubenswrapper[4795]: I0310 15:26:42.474158 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8","Type":"ContainerStarted","Data":"ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2"}
Mar 10 15:26:42 crc kubenswrapper[4795]: I0310 15:26:42.482175 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" podStartSLOduration=4.482162299 podStartE2EDuration="4.482162299s" podCreationTimestamp="2026-03-10 15:26:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:42.481086898 +0000 UTC m=+1235.646827796" watchObservedRunningTime="2026-03-10 15:26:42.482162299 +0000 UTC m=+1235.647903197"
Mar 10 15:26:42 crc kubenswrapper[4795]: I0310 15:26:42.506581 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=12.506565076 podStartE2EDuration="12.506565076s" podCreationTimestamp="2026-03-10 15:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:42.505947619 +0000 UTC m=+1235.671688517" watchObservedRunningTime="2026-03-10 15:26:42.506565076 +0000 UTC m=+1235.672305974"
Mar 10 15:26:43 crc kubenswrapper[4795]: I0310 15:26:43.485697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e7caf4d-b232-4ef1-bcbb-9e11adf55727","Type":"ContainerStarted","Data":"4cc6b3909b7fae8597719b3798d9b553711ed002f88047133e6c9d36dd5de069"}
Mar 10 15:26:43 crc kubenswrapper[4795]: I0310 15:26:43.490428 4795 generic.go:334] "Generic (PLEG): container finished" podID="f93d5e00-4866-406a-ad39-c3ab0b2156b0" containerID="2d097c82909a3759221c437fade51ca1cee2763b48aca5349f14ab15230759bd" exitCode=0
Mar 10 15:26:43 crc kubenswrapper[4795]: I0310 15:26:43.490463 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-t2rg9" event={"ID":"f93d5e00-4866-406a-ad39-c3ab0b2156b0","Type":"ContainerDied","Data":"2d097c82909a3759221c437fade51ca1cee2763b48aca5349f14ab15230759bd"}
Mar 10 15:26:43 crc kubenswrapper[4795]: I0310 15:26:43.495297 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c97bfbc45-gfdc4" event={"ID":"382ef56c-07e8-4af1-87ee-09887564eabc","Type":"ContainerStarted","Data":"7b8f5325d3474a9157d36c929c0d92da4e5b6b211af56ce7d8b8ec9ba7aa95be"}
Mar 10 15:26:43 crc kubenswrapper[4795]: I0310 15:26:43.495365 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c97bfbc45-gfdc4" event={"ID":"382ef56c-07e8-4af1-87ee-09887564eabc","Type":"ContainerStarted","Data":"ffd79a8c9c339f8985e86f020f3ac03f340c3dce531af0d8b24567498c3ce454"}
Mar 10 15:26:43 crc kubenswrapper[4795]: I0310 15:26:43.508040 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=13.50801874 podStartE2EDuration="13.50801874s" podCreationTimestamp="2026-03-10 15:26:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:43.502803851 +0000 UTC m=+1236.668544739" watchObservedRunningTime="2026-03-10 15:26:43.50801874 +0000 UTC m=+1236.673759638"
Mar 10 15:26:43 crc kubenswrapper[4795]: I0310 15:26:43.553472 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5c97bfbc45-gfdc4" podStartSLOduration=2.553453659 podStartE2EDuration="2.553453659s" podCreationTimestamp="2026-03-10 15:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:43.519485138 +0000 UTC m=+1236.685226036" watchObservedRunningTime="2026-03-10 15:26:43.553453659 +0000 UTC m=+1236.719194557"
Mar 10 15:26:44 crc kubenswrapper[4795]: I0310 15:26:44.504003 4795 generic.go:334] "Generic (PLEG): container finished" podID="f9c3084a-a7d5-4703-836b-951571462fee" containerID="46f07e238062e9e174acd6f8104a4bf7a444b7d87dd9aced24ab8acbdfaea4e0" exitCode=0
Mar 10 15:26:44 crc kubenswrapper[4795]: I0310 15:26:44.504099 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9wkrh" event={"ID":"f9c3084a-a7d5-4703-836b-951571462fee","Type":"ContainerDied","Data":"46f07e238062e9e174acd6f8104a4bf7a444b7d87dd9aced24ab8acbdfaea4e0"}
Mar 10 15:26:44 crc kubenswrapper[4795]: I0310 15:26:44.510013 4795 generic.go:334] "Generic (PLEG): container finished" podID="e3e5de16-defe-4daa-94cc-3d50e3461dbd" containerID="4d2bb03ea1839fd98374fa87dd1f961fae081be8e384b53ef71953122c88e056" exitCode=0
Mar 10 15:26:44 crc kubenswrapper[4795]: I0310 15:26:44.510119 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9ktzf" event={"ID":"e3e5de16-defe-4daa-94cc-3d50e3461dbd","Type":"ContainerDied","Data":"4d2bb03ea1839fd98374fa87dd1f961fae081be8e384b53ef71953122c88e056"}
Mar 10 15:26:44 crc kubenswrapper[4795]: I0310 15:26:44.512683 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5c97bfbc45-gfdc4"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.508176 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-t2rg9"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.565972 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-t2rg9" event={"ID":"f93d5e00-4866-406a-ad39-c3ab0b2156b0","Type":"ContainerDied","Data":"27fd645679541b9bd189a858f2d4582821dc36c73f0ff78d7cc7f7164f5c2b48"}
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.566020 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27fd645679541b9bd189a858f2d4582821dc36c73f0ff78d7cc7f7164f5c2b48"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.566165 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-t2rg9"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.614818 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93d5e00-4866-406a-ad39-c3ab0b2156b0-logs\") pod \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") "
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.615316 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-scripts\") pod \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") "
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.615361 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxnnz\" (UniqueName: \"kubernetes.io/projected/f93d5e00-4866-406a-ad39-c3ab0b2156b0-kube-api-access-hxnnz\") pod \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") "
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.615382 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-config-data\") pod \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") "
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.615580 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-combined-ca-bundle\") pod \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\" (UID: \"f93d5e00-4866-406a-ad39-c3ab0b2156b0\") "
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.618843 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f93d5e00-4866-406a-ad39-c3ab0b2156b0-logs" (OuterVolumeSpecName: "logs") pod "f93d5e00-4866-406a-ad39-c3ab0b2156b0" (UID: "f93d5e00-4866-406a-ad39-c3ab0b2156b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.644236 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f93d5e00-4866-406a-ad39-c3ab0b2156b0-kube-api-access-hxnnz" (OuterVolumeSpecName: "kube-api-access-hxnnz") pod "f93d5e00-4866-406a-ad39-c3ab0b2156b0" (UID: "f93d5e00-4866-406a-ad39-c3ab0b2156b0"). InnerVolumeSpecName "kube-api-access-hxnnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.644247 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-scripts" (OuterVolumeSpecName: "scripts") pod "f93d5e00-4866-406a-ad39-c3ab0b2156b0" (UID: "f93d5e00-4866-406a-ad39-c3ab0b2156b0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.707197 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f93d5e00-4866-406a-ad39-c3ab0b2156b0" (UID: "f93d5e00-4866-406a-ad39-c3ab0b2156b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.721488 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.721588 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxnnz\" (UniqueName: \"kubernetes.io/projected/f93d5e00-4866-406a-ad39-c3ab0b2156b0-kube-api-access-hxnnz\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.721670 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.721740 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f93d5e00-4866-406a-ad39-c3ab0b2156b0-logs\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.724415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-config-data" (OuterVolumeSpecName: "config-data") pod "f93d5e00-4866-406a-ad39-c3ab0b2156b0" (UID: "f93d5e00-4866-406a-ad39-c3ab0b2156b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.744751 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7946f6d44-rgccw"]
Mar 10 15:26:45 crc kubenswrapper[4795]: E0310 15:26:45.752061 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f93d5e00-4866-406a-ad39-c3ab0b2156b0" containerName="placement-db-sync"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.752171 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f93d5e00-4866-406a-ad39-c3ab0b2156b0" containerName="placement-db-sync"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.752729 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f93d5e00-4866-406a-ad39-c3ab0b2156b0" containerName="placement-db-sync"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.754340 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.756432 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.757842 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.788635 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7946f6d44-rgccw"]
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.824010 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f93d5e00-4866-406a-ad39-c3ab0b2156b0-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.925545 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-scripts\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.925594 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-config-data\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.925632 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-internal-tls-certs\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.925712 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-combined-ca-bundle\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.925726 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-public-tls-certs\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.925757 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-logs\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:45 crc kubenswrapper[4795]: I0310 15:26:45.925825 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsv9j\" (UniqueName: \"kubernetes.io/projected/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-kube-api-access-gsv9j\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.027824 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsv9j\" (UniqueName: \"kubernetes.io/projected/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-kube-api-access-gsv9j\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.027910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-scripts\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.027934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-config-data\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.027955 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-internal-tls-certs\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.027989 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-combined-ca-bundle\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.028004 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-public-tls-certs\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.028035 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-logs\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.028410 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-logs\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.033518 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-combined-ca-bundle\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.034748 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-internal-tls-certs\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.036165 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-scripts\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.036848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-config-data\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.037263 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-public-tls-certs\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.045001 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsv9j\" (UniqueName: \"kubernetes.io/projected/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-kube-api-access-gsv9j\") pod \"placement-7946f6d44-rgccw\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:46 crc kubenswrapper[4795]: I0310 15:26:46.099987 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:26:48 crc kubenswrapper[4795]: I0310 15:26:48.539027 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:26:48 crc kubenswrapper[4795]: I0310 15:26:48.539559 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:26:48 crc kubenswrapper[4795]: I0310 15:26:48.539614 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh"
Mar 10 15:26:48 crc kubenswrapper[4795]: I0310 15:26:48.540364 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0a98890ab8851ba455c7b85a98ad7dd9a020e91e458aa5b6868584031febf67b"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 15:26:48 crc kubenswrapper[4795]: I0310 15:26:48.540429 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://0a98890ab8851ba455c7b85a98ad7dd9a020e91e458aa5b6868584031febf67b" gracePeriod=600
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.483355 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9wkrh"
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.502361 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9ktzf"
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.503313 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9"
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.579613 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lrlkz"]
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.592641 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" podUID="bb23750d-b817-443b-8876-e16a7775629f" containerName="dnsmasq-dns" containerID="cri-o://48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a" gracePeriod=10
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.596754 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-combined-ca-bundle\") pod \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") "
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.596825 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-config-data\") pod \"f9c3084a-a7d5-4703-836b-951571462fee\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") "
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.596854 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-combined-ca-bundle\") pod \"f9c3084a-a7d5-4703-836b-951571462fee\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") "
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.596895 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj5ls\" (UniqueName: \"kubernetes.io/projected/e3e5de16-defe-4daa-94cc-3d50e3461dbd-kube-api-access-mj5ls\") pod \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") "
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.596922 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-credential-keys\") pod \"f9c3084a-a7d5-4703-836b-951571462fee\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") "
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.597028 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-db-sync-config-data\") pod \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\" (UID: \"e3e5de16-defe-4daa-94cc-3d50e3461dbd\") "
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.597118 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-scripts\") pod \"f9c3084a-a7d5-4703-836b-951571462fee\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") "
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.597135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-fernet-keys\") pod \"f9c3084a-a7d5-4703-836b-951571462fee\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") "
Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.597201 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for
volume \"kube-api-access-dhs6p\" (UniqueName: \"kubernetes.io/projected/f9c3084a-a7d5-4703-836b-951571462fee-kube-api-access-dhs6p\") pod \"f9c3084a-a7d5-4703-836b-951571462fee\" (UID: \"f9c3084a-a7d5-4703-836b-951571462fee\") " Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.608512 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f9c3084a-a7d5-4703-836b-951571462fee" (UID: "f9c3084a-a7d5-4703-836b-951571462fee"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.613307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-scripts" (OuterVolumeSpecName: "scripts") pod "f9c3084a-a7d5-4703-836b-951571462fee" (UID: "f9c3084a-a7d5-4703-836b-951571462fee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.615683 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e5de16-defe-4daa-94cc-3d50e3461dbd-kube-api-access-mj5ls" (OuterVolumeSpecName: "kube-api-access-mj5ls") pod "e3e5de16-defe-4daa-94cc-3d50e3461dbd" (UID: "e3e5de16-defe-4daa-94cc-3d50e3461dbd"). InnerVolumeSpecName "kube-api-access-mj5ls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.619625 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f9c3084a-a7d5-4703-836b-951571462fee" (UID: "f9c3084a-a7d5-4703-836b-951571462fee"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.623878 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="0a98890ab8851ba455c7b85a98ad7dd9a020e91e458aa5b6868584031febf67b" exitCode=0 Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.624036 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"0a98890ab8851ba455c7b85a98ad7dd9a020e91e458aa5b6868584031febf67b"} Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.624861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"bb8b52e71ef30156b6c9527b50e9705f2708ddbef05f7d631d0751d8aba7942b"} Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.624893 4795 scope.go:117] "RemoveContainer" containerID="552e3b68186e04166961655f3dfc811f8b5622423236f01ad885373bab0e984d" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.636745 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e3e5de16-defe-4daa-94cc-3d50e3461dbd" (UID: "e3e5de16-defe-4daa-94cc-3d50e3461dbd"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.637098 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-9wkrh" event={"ID":"f9c3084a-a7d5-4703-836b-951571462fee","Type":"ContainerDied","Data":"ecde624beeb7e495c74b78283e03762088358ca501f841b4862931f1b247b9ac"} Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.637132 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecde624beeb7e495c74b78283e03762088358ca501f841b4862931f1b247b9ac" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.637353 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-9wkrh" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.640580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9c3084a-a7d5-4703-836b-951571462fee-kube-api-access-dhs6p" (OuterVolumeSpecName: "kube-api-access-dhs6p") pod "f9c3084a-a7d5-4703-836b-951571462fee" (UID: "f9c3084a-a7d5-4703-836b-951571462fee"). InnerVolumeSpecName "kube-api-access-dhs6p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.647532 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79684c90-dc4f-4187-a086-c0777de981e3","Type":"ContainerStarted","Data":"33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824"} Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.662111 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-9ktzf" event={"ID":"e3e5de16-defe-4daa-94cc-3d50e3461dbd","Type":"ContainerDied","Data":"f48d5ee36ddd562f8c5d22cb6afc774802a00bade9f649d501641eb9453bd530"} Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.662453 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f48d5ee36ddd562f8c5d22cb6afc774802a00bade9f649d501641eb9453bd530" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.662994 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-9ktzf" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.683223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-config-data" (OuterVolumeSpecName: "config-data") pod "f9c3084a-a7d5-4703-836b-951571462fee" (UID: "f9c3084a-a7d5-4703-836b-951571462fee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.685478 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f9c3084a-a7d5-4703-836b-951571462fee" (UID: "f9c3084a-a7d5-4703-836b-951571462fee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.699642 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.699883 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.699985 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.700099 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhs6p\" (UniqueName: \"kubernetes.io/projected/f9c3084a-a7d5-4703-836b-951571462fee-kube-api-access-dhs6p\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.700188 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.700265 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.700353 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj5ls\" (UniqueName: \"kubernetes.io/projected/e3e5de16-defe-4daa-94cc-3d50e3461dbd-kube-api-access-mj5ls\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.700439 
4795 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f9c3084a-a7d5-4703-836b-951571462fee-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.731169 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3e5de16-defe-4daa-94cc-3d50e3461dbd" (UID: "e3e5de16-defe-4daa-94cc-3d50e3461dbd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.776916 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7946f6d44-rgccw"] Mar 10 15:26:49 crc kubenswrapper[4795]: I0310 15:26:49.802480 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e5de16-defe-4daa-94cc-3d50e3461dbd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.075395 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.211102 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-nb\") pod \"bb23750d-b817-443b-8876-e16a7775629f\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.211150 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhkh8\" (UniqueName: \"kubernetes.io/projected/bb23750d-b817-443b-8876-e16a7775629f-kube-api-access-jhkh8\") pod \"bb23750d-b817-443b-8876-e16a7775629f\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.211193 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-sb\") pod \"bb23750d-b817-443b-8876-e16a7775629f\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.211233 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-svc\") pod \"bb23750d-b817-443b-8876-e16a7775629f\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.211275 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-swift-storage-0\") pod \"bb23750d-b817-443b-8876-e16a7775629f\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.211305 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-config\") pod \"bb23750d-b817-443b-8876-e16a7775629f\" (UID: \"bb23750d-b817-443b-8876-e16a7775629f\") " Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.221596 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb23750d-b817-443b-8876-e16a7775629f-kube-api-access-jhkh8" (OuterVolumeSpecName: "kube-api-access-jhkh8") pod "bb23750d-b817-443b-8876-e16a7775629f" (UID: "bb23750d-b817-443b-8876-e16a7775629f"). InnerVolumeSpecName "kube-api-access-jhkh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.278017 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bb23750d-b817-443b-8876-e16a7775629f" (UID: "bb23750d-b817-443b-8876-e16a7775629f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.299621 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb23750d-b817-443b-8876-e16a7775629f" (UID: "bb23750d-b817-443b-8876-e16a7775629f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.302555 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb23750d-b817-443b-8876-e16a7775629f" (UID: "bb23750d-b817-443b-8876-e16a7775629f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.310591 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb23750d-b817-443b-8876-e16a7775629f" (UID: "bb23750d-b817-443b-8876-e16a7775629f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.316772 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.316809 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhkh8\" (UniqueName: \"kubernetes.io/projected/bb23750d-b817-443b-8876-e16a7775629f-kube-api-access-jhkh8\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.316821 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.316830 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.316864 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.330770 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-config" (OuterVolumeSpecName: "config") pod "bb23750d-b817-443b-8876-e16a7775629f" (UID: "bb23750d-b817-443b-8876-e16a7775629f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.389844 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.391348 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.406350 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.406589 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.419369 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb23750d-b817-443b-8876-e16a7775629f-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.437703 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.454778 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.460624 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.465890 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.670945 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5b46885bb9-hzjqn"] Mar 10 15:26:50 crc kubenswrapper[4795]: E0310 15:26:50.671409 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb23750d-b817-443b-8876-e16a7775629f" containerName="init" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.671430 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb23750d-b817-443b-8876-e16a7775629f" containerName="init" Mar 10 15:26:50 crc kubenswrapper[4795]: E0310 15:26:50.671470 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e5de16-defe-4daa-94cc-3d50e3461dbd" containerName="barbican-db-sync" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.671479 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e5de16-defe-4daa-94cc-3d50e3461dbd" containerName="barbican-db-sync" Mar 10 15:26:50 crc kubenswrapper[4795]: E0310 15:26:50.671493 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb23750d-b817-443b-8876-e16a7775629f" containerName="dnsmasq-dns" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.671500 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb23750d-b817-443b-8876-e16a7775629f" containerName="dnsmasq-dns" Mar 10 15:26:50 crc kubenswrapper[4795]: E0310 15:26:50.671511 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9c3084a-a7d5-4703-836b-951571462fee" containerName="keystone-bootstrap" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.671520 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c3084a-a7d5-4703-836b-951571462fee" containerName="keystone-bootstrap" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.671720 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9c3084a-a7d5-4703-836b-951571462fee" containerName="keystone-bootstrap" Mar 10 15:26:50 crc 
kubenswrapper[4795]: I0310 15:26:50.671746 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e5de16-defe-4daa-94cc-3d50e3461dbd" containerName="barbican-db-sync" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.671771 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb23750d-b817-443b-8876-e16a7775629f" containerName="dnsmasq-dns" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.672471 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.675850 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.675883 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.676146 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.676237 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.677126 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-mn9kr" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.677784 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.682380 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7946f6d44-rgccw" event={"ID":"5d18a18e-e6f3-4c43-a129-36d0f40e3a22","Type":"ContainerStarted","Data":"cddd0c411964b371ac93f00d58eba7a6b7a5cfe9371f5cf3bd18b6b29a8e237c"} Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.682565 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-7946f6d44-rgccw" event={"ID":"5d18a18e-e6f3-4c43-a129-36d0f40e3a22","Type":"ContainerStarted","Data":"edb515c902b3c5c0d67470c88737b6d3184390a0ff6cbb702c71e344e7dee06b"} Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.682636 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7946f6d44-rgccw" event={"ID":"5d18a18e-e6f3-4c43-a129-36d0f40e3a22","Type":"ContainerStarted","Data":"fa5a910646ab02725b14f67fb39ac155a34b09bc5d2bcbba72eb3eab9ee09a52"} Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.682749 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7946f6d44-rgccw" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.682812 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7946f6d44-rgccw" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.686179 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b46885bb9-hzjqn"] Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.686704 4795 generic.go:334] "Generic (PLEG): container finished" podID="bb23750d-b817-443b-8876-e16a7775629f" containerID="48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a" exitCode=0 Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.686742 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" event={"ID":"bb23750d-b817-443b-8876-e16a7775629f","Type":"ContainerDied","Data":"48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a"} Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.686780 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" event={"ID":"bb23750d-b817-443b-8876-e16a7775629f","Type":"ContainerDied","Data":"76feefe9519e1da5eecbe1f5d3ee153cf077a73763adeb144ac296990942a1dd"} Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.686800 4795 scope.go:117] "RemoveContainer" 
containerID="48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.686953 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-lrlkz" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.687002 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.687567 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.687596 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.687612 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.723099 4795 scope.go:117] "RemoveContainer" containerID="a83be8cdb91213084df280f2f75f7e9b9d5798bba075eaea8c9d64e0e4b9d89e" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.747375 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjck\" (UniqueName: \"kubernetes.io/projected/d045398c-e0bb-47f7-b069-67c75ba5dab5-kube-api-access-ktjck\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.747453 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-config-data\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc 
kubenswrapper[4795]: I0310 15:26:50.747514 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-combined-ca-bundle\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.747577 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-internal-tls-certs\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.747616 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-fernet-keys\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.747701 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-public-tls-certs\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.747758 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-scripts\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc 
kubenswrapper[4795]: I0310 15:26:50.747830 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-credential-keys\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.823817 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7946f6d44-rgccw" podStartSLOduration=5.823792887 podStartE2EDuration="5.823792887s" podCreationTimestamp="2026-03-10 15:26:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:50.779670715 +0000 UTC m=+1243.945411613" watchObservedRunningTime="2026-03-10 15:26:50.823792887 +0000 UTC m=+1243.989533785" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.833253 4795 scope.go:117] "RemoveContainer" containerID="48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a" Mar 10 15:26:50 crc kubenswrapper[4795]: E0310 15:26:50.842090 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a\": container with ID starting with 48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a not found: ID does not exist" containerID="48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.842137 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a"} err="failed to get container status \"48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a\": rpc error: code = NotFound desc = could not find container 
\"48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a\": container with ID starting with 48afde46f279bdeeb8a17909bb566e81085bb0dea9bc1f6d1aa327e9e28dd18a not found: ID does not exist" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.842165 4795 scope.go:117] "RemoveContainer" containerID="a83be8cdb91213084df280f2f75f7e9b9d5798bba075eaea8c9d64e0e4b9d89e" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.850564 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-internal-tls-certs\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.850627 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-fernet-keys\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.850687 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-public-tls-certs\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.850751 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-scripts\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.850790 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-credential-keys\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.850880 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjck\" (UniqueName: \"kubernetes.io/projected/d045398c-e0bb-47f7-b069-67c75ba5dab5-kube-api-access-ktjck\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.850912 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-config-data\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.850953 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-combined-ca-bundle\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: E0310 15:26:50.852562 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a83be8cdb91213084df280f2f75f7e9b9d5798bba075eaea8c9d64e0e4b9d89e\": container with ID starting with a83be8cdb91213084df280f2f75f7e9b9d5798bba075eaea8c9d64e0e4b9d89e not found: ID does not exist" containerID="a83be8cdb91213084df280f2f75f7e9b9d5798bba075eaea8c9d64e0e4b9d89e" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.852648 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a83be8cdb91213084df280f2f75f7e9b9d5798bba075eaea8c9d64e0e4b9d89e"} err="failed to get container status \"a83be8cdb91213084df280f2f75f7e9b9d5798bba075eaea8c9d64e0e4b9d89e\": rpc error: code = NotFound desc = could not find container \"a83be8cdb91213084df280f2f75f7e9b9d5798bba075eaea8c9d64e0e4b9d89e\": container with ID starting with a83be8cdb91213084df280f2f75f7e9b9d5798bba075eaea8c9d64e0e4b9d89e not found: ID does not exist" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.865332 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-58794994d5-4hvvj"] Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.870753 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.880172 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-58794994d5-4hvvj"] Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.882769 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.884327 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-7rp75" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.884743 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.899839 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-68c5995f66-69w72"] Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.902593 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.906815 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-public-tls-certs\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.906941 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-combined-ca-bundle\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.907219 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-internal-tls-certs\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.907222 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.907572 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-scripts\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.907623 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-fernet-keys\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.907826 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-credential-keys\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.908144 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d045398c-e0bb-47f7-b069-67c75ba5dab5-config-data\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.909315 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lrlkz"] Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.929396 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjck\" (UniqueName: \"kubernetes.io/projected/d045398c-e0bb-47f7-b069-67c75ba5dab5-kube-api-access-ktjck\") pod \"keystone-5b46885bb9-hzjqn\" (UID: \"d045398c-e0bb-47f7-b069-67c75ba5dab5\") " pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.954016 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-combined-ca-bundle\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.954077 
4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.954098 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data-custom\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.954123 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/071b61e9-efc3-4421-87b8-fab46d1c1f6e-logs\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.954158 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckx8n\" (UniqueName: \"kubernetes.io/projected/071b61e9-efc3-4421-87b8-fab46d1c1f6e-kube-api-access-ckx8n\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.954172 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-logs\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " 
pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.954185 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-combined-ca-bundle\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.954214 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.954256 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data-custom\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.954280 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8t9r\" (UniqueName: \"kubernetes.io/projected/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-kube-api-access-w8t9r\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.964193 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-lrlkz"] Mar 10 15:26:50 crc kubenswrapper[4795]: I0310 15:26:50.978244 
4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-68c5995f66-69w72"] Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.006991 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.017366 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-mx8gf"] Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.018747 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071119 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-config\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071166 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071196 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-combined-ca-bundle\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-wcrsv\" (UniqueName: \"kubernetes.io/projected/32364ecb-f9b9-4abc-8491-bdf2105d0848-kube-api-access-wcrsv\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071270 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data-custom\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071291 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071315 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/071b61e9-efc3-4421-87b8-fab46d1c1f6e-logs\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071361 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-ckx8n\" (UniqueName: \"kubernetes.io/projected/071b61e9-efc3-4421-87b8-fab46d1c1f6e-kube-api-access-ckx8n\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071378 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-logs\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071393 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-combined-ca-bundle\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071456 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071498 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071523 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data-custom\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.071546 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8t9r\" (UniqueName: \"kubernetes.io/projected/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-kube-api-access-w8t9r\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.074588 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/071b61e9-efc3-4421-87b8-fab46d1c1f6e-logs\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.075476 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-logs\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.093141 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.098444 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data-custom\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.098608 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-combined-ca-bundle\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.099172 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data-custom\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.101124 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-mx8gf"] Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.101794 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-combined-ca-bundle\") pod 
\"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.102553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.106471 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8t9r\" (UniqueName: \"kubernetes.io/projected/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-kube-api-access-w8t9r\") pod \"barbican-worker-58794994d5-4hvvj\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.111506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckx8n\" (UniqueName: \"kubernetes.io/projected/071b61e9-efc3-4421-87b8-fab46d1c1f6e-kube-api-access-ckx8n\") pod \"barbican-keystone-listener-68c5995f66-69w72\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.174019 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.174118 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-nb\") 
pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.174208 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-config\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.174235 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.174265 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcrsv\" (UniqueName: \"kubernetes.io/projected/32364ecb-f9b9-4abc-8491-bdf2105d0848-kube-api-access-wcrsv\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.174310 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.175309 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: 
\"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.175978 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.178783 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.178937 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-config\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.178970 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-76476484b5-x6dzm"] Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.181319 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.187447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.195134 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76476484b5-x6dzm"] Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.209375 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-67b995dbd6-dgj6m"] Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.210731 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.217012 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcrsv\" (UniqueName: \"kubernetes.io/projected/32364ecb-f9b9-4abc-8491-bdf2105d0848-kube-api-access-wcrsv\") pod \"dnsmasq-dns-848cf88cfc-mx8gf\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.217058 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-67b995dbd6-dgj6m"] Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.279160 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-logs\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc 
kubenswrapper[4795]: I0310 15:26:51.279602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd0afebe-9b88-450e-88e1-641870206db5-logs\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.279653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-config-data\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.279684 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-combined-ca-bundle\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.279755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0afebe-9b88-450e-88e1-641870206db5-config-data\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.279812 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd0afebe-9b88-450e-88e1-641870206db5-config-data-custom\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: 
\"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.279844 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6m5v\" (UniqueName: \"kubernetes.io/projected/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-kube-api-access-w6m5v\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.279895 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0afebe-9b88-450e-88e1-641870206db5-combined-ca-bundle\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.279920 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28h2d\" (UniqueName: \"kubernetes.io/projected/bd0afebe-9b88-450e-88e1-641870206db5-kube-api-access-28h2d\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.279969 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-config-data-custom\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.288189 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-api-d485ffcdb-tpksk"] Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.297357 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.297366 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.298381 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.300786 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.368247 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381023 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd0afebe-9b88-450e-88e1-641870206db5-logs\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381082 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-config-data\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381113 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-combined-ca-bundle\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381131 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d485ffcdb-tpksk"] Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381147 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data-custom\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381275 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0afebe-9b88-450e-88e1-641870206db5-config-data\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381357 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd0afebe-9b88-450e-88e1-641870206db5-config-data-custom\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381386 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6m5v\" (UniqueName: \"kubernetes.io/projected/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-kube-api-access-w6m5v\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " 
pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381432 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwwdh\" (UniqueName: \"kubernetes.io/projected/ff736888-83cd-4b42-a4b3-d1e44b632972-kube-api-access-nwwdh\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381464 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0afebe-9b88-450e-88e1-641870206db5-combined-ca-bundle\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381489 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28h2d\" (UniqueName: \"kubernetes.io/projected/bd0afebe-9b88-450e-88e1-641870206db5-kube-api-access-28h2d\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381516 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff736888-83cd-4b42-a4b3-d1e44b632972-logs\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381552 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-config-data-custom\") pod 
\"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381567 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381654 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-combined-ca-bundle\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.381689 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-logs\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.382201 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-logs\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.383079 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd0afebe-9b88-450e-88e1-641870206db5-logs\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: 
\"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.391950 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd0afebe-9b88-450e-88e1-641870206db5-combined-ca-bundle\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.400724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd0afebe-9b88-450e-88e1-641870206db5-config-data\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.404277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-combined-ca-bundle\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.409948 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd0afebe-9b88-450e-88e1-641870206db5-config-data-custom\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.412348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-config-data-custom\") pod 
\"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.415060 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-config-data\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.417517 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6m5v\" (UniqueName: \"kubernetes.io/projected/0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f-kube-api-access-w6m5v\") pod \"barbican-worker-76476484b5-x6dzm\" (UID: \"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f\") " pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.428579 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28h2d\" (UniqueName: \"kubernetes.io/projected/bd0afebe-9b88-450e-88e1-641870206db5-kube-api-access-28h2d\") pod \"barbican-keystone-listener-67b995dbd6-dgj6m\" (UID: \"bd0afebe-9b88-450e-88e1-641870206db5\") " pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.484157 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data-custom\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.484281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwwdh\" (UniqueName: 
\"kubernetes.io/projected/ff736888-83cd-4b42-a4b3-d1e44b632972-kube-api-access-nwwdh\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.484349 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff736888-83cd-4b42-a4b3-d1e44b632972-logs\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.484380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.484439 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-combined-ca-bundle\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.484774 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67547556b6-45876" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.485648 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff736888-83cd-4b42-a4b3-d1e44b632972-logs\") pod 
\"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.494022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data-custom\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.495393 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-combined-ca-bundle\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.496118 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb23750d-b817-443b-8876-e16a7775629f" path="/var/lib/kubelet/pods/bb23750d-b817-443b-8876-e16a7775629f/volumes" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.527135 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.531663 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwwdh\" (UniqueName: \"kubernetes.io/projected/ff736888-83cd-4b42-a4b3-d1e44b632972-kube-api-access-nwwdh\") pod \"barbican-api-d485ffcdb-tpksk\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.565781 4795 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-76476484b5-x6dzm" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.584002 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5869d54dfb-2wjww" podUID="379541ea-de81-488c-b6dc-2f5873fdfbeb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.612544 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.622829 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:51 crc kubenswrapper[4795]: I0310 15:26:51.823550 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5b46885bb9-hzjqn"] Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.120841 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-58794994d5-4hvvj"] Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.367221 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-68c5995f66-69w72"] Mar 10 15:26:52 crc kubenswrapper[4795]: W0310 15:26:52.388281 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod071b61e9_efc3_4421_87b8_fab46d1c1f6e.slice/crio-b1734d7265e3ecc08877ee065c66061e5a58bf8b6611768e3db15bf08c6ef560 WatchSource:0}: Error finding container b1734d7265e3ecc08877ee065c66061e5a58bf8b6611768e3db15bf08c6ef560: Status 404 returned error can't find the container with id b1734d7265e3ecc08877ee065c66061e5a58bf8b6611768e3db15bf08c6ef560 Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.589828 4795 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-mx8gf"] Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.605193 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-76476484b5-x6dzm"] Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.757940 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" event={"ID":"32364ecb-f9b9-4abc-8491-bdf2105d0848","Type":"ContainerStarted","Data":"41a5a4a5b433e879c391b0b1265f875d11aa8da831128ca0ad45498baea8ce95"} Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.769610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b46885bb9-hzjqn" event={"ID":"d045398c-e0bb-47f7-b069-67c75ba5dab5","Type":"ContainerStarted","Data":"a07da93055afbeb004767f5ce100fd18b344ac543fc8c236b3661ebf69e297f4"} Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.769650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5b46885bb9-hzjqn" event={"ID":"d045398c-e0bb-47f7-b069-67c75ba5dab5","Type":"ContainerStarted","Data":"bfa1c422756e2ce413082c6fc9ac0d319ffda5961d3e7f0a5c8a865afe711c00"} Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.770186 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.772598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58794994d5-4hvvj" event={"ID":"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a","Type":"ContainerStarted","Data":"04d37e7dbe9c20d1ac141bfd430e1092895f6795e7769429e88315902787200a"} Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.774532 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76476484b5-x6dzm" 
event={"ID":"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f","Type":"ContainerStarted","Data":"8818a5ae60e0f139782ad1eabd5828a285cb02b312a4e640fb5db1475b102e8f"} Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.779867 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" event={"ID":"071b61e9-efc3-4421-87b8-fab46d1c1f6e","Type":"ContainerStarted","Data":"b1734d7265e3ecc08877ee065c66061e5a58bf8b6611768e3db15bf08c6ef560"} Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.791358 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5b46885bb9-hzjqn" podStartSLOduration=2.7913359140000003 podStartE2EDuration="2.791335914s" podCreationTimestamp="2026-03-10 15:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:52.789147572 +0000 UTC m=+1245.954888470" watchObservedRunningTime="2026-03-10 15:26:52.791335914 +0000 UTC m=+1245.957076812" Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.795313 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.795340 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.795309 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mg79m" event={"ID":"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2","Type":"ContainerStarted","Data":"7aabf5d7cb7726116bb126dfadad56556c53264e0490c57ab36ce5b890be4f26"} Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.818571 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d485ffcdb-tpksk"] Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.824654 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mg79m" 
podStartSLOduration=3.779443983 podStartE2EDuration="41.824633406s" podCreationTimestamp="2026-03-10 15:26:11 +0000 UTC" firstStartedPulling="2026-03-10 15:26:13.03203888 +0000 UTC m=+1206.197779778" lastFinishedPulling="2026-03-10 15:26:51.077228303 +0000 UTC m=+1244.242969201" observedRunningTime="2026-03-10 15:26:52.813704864 +0000 UTC m=+1245.979445762" watchObservedRunningTime="2026-03-10 15:26:52.824633406 +0000 UTC m=+1245.990374324" Mar 10 15:26:52 crc kubenswrapper[4795]: W0310 15:26:52.827866 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff736888_83cd_4b42_a4b3_d1e44b632972.slice/crio-960b4e14ac584a7a84d335bde14eff76da6780799d9a61af31805219a3925acd WatchSource:0}: Error finding container 960b4e14ac584a7a84d335bde14eff76da6780799d9a61af31805219a3925acd: Status 404 returned error can't find the container with id 960b4e14ac584a7a84d335bde14eff76da6780799d9a61af31805219a3925acd Mar 10 15:26:52 crc kubenswrapper[4795]: I0310 15:26:52.853611 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-67b995dbd6-dgj6m"] Mar 10 15:26:53 crc kubenswrapper[4795]: I0310 15:26:53.837706 4795 generic.go:334] "Generic (PLEG): container finished" podID="32364ecb-f9b9-4abc-8491-bdf2105d0848" containerID="749a02bca4d003e97979b1fbacfc24d38895e7be51630c1662242f3992623a15" exitCode=0 Mar 10 15:26:53 crc kubenswrapper[4795]: I0310 15:26:53.838091 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" event={"ID":"32364ecb-f9b9-4abc-8491-bdf2105d0848","Type":"ContainerDied","Data":"749a02bca4d003e97979b1fbacfc24d38895e7be51630c1662242f3992623a15"} Mar 10 15:26:53 crc kubenswrapper[4795]: I0310 15:26:53.856660 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d485ffcdb-tpksk" 
event={"ID":"ff736888-83cd-4b42-a4b3-d1e44b632972","Type":"ContainerStarted","Data":"d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702"} Mar 10 15:26:53 crc kubenswrapper[4795]: I0310 15:26:53.857686 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d485ffcdb-tpksk" event={"ID":"ff736888-83cd-4b42-a4b3-d1e44b632972","Type":"ContainerStarted","Data":"4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75"} Mar 10 15:26:53 crc kubenswrapper[4795]: I0310 15:26:53.857807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d485ffcdb-tpksk" event={"ID":"ff736888-83cd-4b42-a4b3-d1e44b632972","Type":"ContainerStarted","Data":"960b4e14ac584a7a84d335bde14eff76da6780799d9a61af31805219a3925acd"} Mar 10 15:26:53 crc kubenswrapper[4795]: I0310 15:26:53.858794 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:53 crc kubenswrapper[4795]: I0310 15:26:53.858913 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:26:53 crc kubenswrapper[4795]: I0310 15:26:53.880674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" event={"ID":"bd0afebe-9b88-450e-88e1-641870206db5","Type":"ContainerStarted","Data":"359e81abe364b29aa525144b9a4546b15a01f1d992e1b6f55e88cb65140736ef"} Mar 10 15:26:53 crc kubenswrapper[4795]: I0310 15:26:53.908227 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 15:26:53 crc kubenswrapper[4795]: I0310 15:26:53.908388 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:26:53 crc kubenswrapper[4795]: I0310 15:26:53.962712 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d485ffcdb-tpksk" 
podStartSLOduration=2.962693687 podStartE2EDuration="2.962693687s" podCreationTimestamp="2026-03-10 15:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:53.952256679 +0000 UTC m=+1247.117997577" watchObservedRunningTime="2026-03-10 15:26:53.962693687 +0000 UTC m=+1247.128434585" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.114570 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.114652 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.125340 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-b68874cc4-rzntx"] Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.127370 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.131475 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.131512 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.146943 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b68874cc4-rzntx"] Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.304239 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-config-data\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc 
kubenswrapper[4795]: I0310 15:26:54.304318 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-combined-ca-bundle\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.304347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-public-tls-certs\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.304392 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-config-data-custom\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.304447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv9dz\" (UniqueName: \"kubernetes.io/projected/0940a851-b873-4348-b89d-6ca90cf8646f-kube-api-access-sv9dz\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.304478 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0940a851-b873-4348-b89d-6ca90cf8646f-logs\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " 
pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.305148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-internal-tls-certs\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.406572 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-combined-ca-bundle\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.406641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-public-tls-certs\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.406668 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-config-data-custom\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.406684 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv9dz\" (UniqueName: \"kubernetes.io/projected/0940a851-b873-4348-b89d-6ca90cf8646f-kube-api-access-sv9dz\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " 
pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.406707 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0940a851-b873-4348-b89d-6ca90cf8646f-logs\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.406752 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-internal-tls-certs\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.406837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-config-data\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.407486 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0940a851-b873-4348-b89d-6ca90cf8646f-logs\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.413787 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-config-data-custom\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.416033 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-config-data\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.417633 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-combined-ca-bundle\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.432883 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv9dz\" (UniqueName: \"kubernetes.io/projected/0940a851-b873-4348-b89d-6ca90cf8646f-kube-api-access-sv9dz\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.435691 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-public-tls-certs\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.449167 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0940a851-b873-4348-b89d-6ca90cf8646f-internal-tls-certs\") pod \"barbican-api-b68874cc4-rzntx\" (UID: \"0940a851-b873-4348-b89d-6ca90cf8646f\") " pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.470367 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.509988 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.777734 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.906995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" event={"ID":"32364ecb-f9b9-4abc-8491-bdf2105d0848","Type":"ContainerStarted","Data":"6d2ecbd119e1c6acab3a76364b2b12453e662b3d9712969f1b5b24f00ad5f26d"} Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.908550 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:26:54 crc kubenswrapper[4795]: I0310 15:26:54.947466 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" podStartSLOduration=4.947449163 podStartE2EDuration="4.947449163s" podCreationTimestamp="2026-03-10 15:26:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:54.932119285 +0000 UTC m=+1248.097860193" watchObservedRunningTime="2026-03-10 15:26:54.947449163 +0000 UTC m=+1248.113190061" Mar 10 15:26:56 crc kubenswrapper[4795]: I0310 15:26:56.661760 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-b68874cc4-rzntx"] Mar 10 15:26:56 crc kubenswrapper[4795]: W0310 15:26:56.682803 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0940a851_b873_4348_b89d_6ca90cf8646f.slice/crio-59e629d3a715f2a247a8d2f85f901521b17dff0d2cdf3ee3b863f17fc75b19fb WatchSource:0}: Error 
finding container 59e629d3a715f2a247a8d2f85f901521b17dff0d2cdf3ee3b863f17fc75b19fb: Status 404 returned error can't find the container with id 59e629d3a715f2a247a8d2f85f901521b17dff0d2cdf3ee3b863f17fc75b19fb Mar 10 15:26:56 crc kubenswrapper[4795]: I0310 15:26:56.945860 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58794994d5-4hvvj" event={"ID":"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a","Type":"ContainerStarted","Data":"91333c4383342b5e6e355f280ab7c8970ff1e9072cb33e1c8fc540d799975268"} Mar 10 15:26:56 crc kubenswrapper[4795]: I0310 15:26:56.945909 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58794994d5-4hvvj" event={"ID":"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a","Type":"ContainerStarted","Data":"26bea14b0a8fdacc73844dff046507f7cfc99619978d97823b4ebec17b030e55"} Mar 10 15:26:56 crc kubenswrapper[4795]: I0310 15:26:56.957266 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" event={"ID":"bd0afebe-9b88-450e-88e1-641870206db5","Type":"ContainerStarted","Data":"d56a89d2cbe5c1f34c7ee421474a1ac875badb09e77d2928f51681e1eff3ee65"} Mar 10 15:26:56 crc kubenswrapper[4795]: I0310 15:26:56.957304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" event={"ID":"bd0afebe-9b88-450e-88e1-641870206db5","Type":"ContainerStarted","Data":"9caabc2489eab13c113f65e0e7d3ab18ac1ec24180c54c55082424b4dd883870"} Mar 10 15:26:56 crc kubenswrapper[4795]: I0310 15:26:56.963785 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-58794994d5-4hvvj" podStartSLOduration=3.034320182 podStartE2EDuration="6.963770636s" podCreationTimestamp="2026-03-10 15:26:50 +0000 UTC" firstStartedPulling="2026-03-10 15:26:52.206261865 +0000 UTC m=+1245.372002763" lastFinishedPulling="2026-03-10 15:26:56.135712319 +0000 UTC m=+1249.301453217" 
observedRunningTime="2026-03-10 15:26:56.963251611 +0000 UTC m=+1250.128992509" watchObservedRunningTime="2026-03-10 15:26:56.963770636 +0000 UTC m=+1250.129511534" Mar 10 15:26:56 crc kubenswrapper[4795]: I0310 15:26:56.970537 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76476484b5-x6dzm" event={"ID":"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f","Type":"ContainerStarted","Data":"7559714f946f009ae9dd708d51f2408895d3bec6c328e9106f38d4372ca3a076"} Mar 10 15:26:56 crc kubenswrapper[4795]: I0310 15:26:56.970580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-76476484b5-x6dzm" event={"ID":"0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f","Type":"ContainerStarted","Data":"906f0cd7623db939f1cdbd3153b3ef17918046f0a5a8e2ba1fe48c994c642fbb"} Mar 10 15:26:56 crc kubenswrapper[4795]: I0310 15:26:56.978009 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b68874cc4-rzntx" event={"ID":"0940a851-b873-4348-b89d-6ca90cf8646f","Type":"ContainerStarted","Data":"ee81bdc4c267dd74364bf731a04f26761436a746e15bb1b3524e5428eb008d66"} Mar 10 15:26:56 crc kubenswrapper[4795]: I0310 15:26:56.978050 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b68874cc4-rzntx" event={"ID":"0940a851-b873-4348-b89d-6ca90cf8646f","Type":"ContainerStarted","Data":"59e629d3a715f2a247a8d2f85f901521b17dff0d2cdf3ee3b863f17fc75b19fb"} Mar 10 15:26:56 crc kubenswrapper[4795]: I0310 15:26:56.980655 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" event={"ID":"071b61e9-efc3-4421-87b8-fab46d1c1f6e","Type":"ContainerStarted","Data":"c960fa9d44515a2502f8b594165f1c50c5e07e010ca622265755fbdbfccad43e"} Mar 10 15:26:57 crc kubenswrapper[4795]: I0310 15:26:57.016260 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-76476484b5-x6dzm" podStartSLOduration=2.47842602 
podStartE2EDuration="6.016239196s" podCreationTimestamp="2026-03-10 15:26:51 +0000 UTC" firstStartedPulling="2026-03-10 15:26:52.604602245 +0000 UTC m=+1245.770343143" lastFinishedPulling="2026-03-10 15:26:56.142415421 +0000 UTC m=+1249.308156319" observedRunningTime="2026-03-10 15:26:57.009766301 +0000 UTC m=+1250.175507199" watchObservedRunningTime="2026-03-10 15:26:57.016239196 +0000 UTC m=+1250.181980094" Mar 10 15:26:57 crc kubenswrapper[4795]: I0310 15:26:57.018927 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-67b995dbd6-dgj6m" podStartSLOduration=2.7817130629999998 podStartE2EDuration="6.018919003s" podCreationTimestamp="2026-03-10 15:26:51 +0000 UTC" firstStartedPulling="2026-03-10 15:26:52.899570339 +0000 UTC m=+1246.065311227" lastFinishedPulling="2026-03-10 15:26:56.136776269 +0000 UTC m=+1249.302517167" observedRunningTime="2026-03-10 15:26:56.993095424 +0000 UTC m=+1250.158836312" watchObservedRunningTime="2026-03-10 15:26:57.018919003 +0000 UTC m=+1250.184659891" Mar 10 15:26:57 crc kubenswrapper[4795]: I0310 15:26:57.039536 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-68c5995f66-69w72"] Mar 10 15:26:57 crc kubenswrapper[4795]: I0310 15:26:57.058201 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-58794994d5-4hvvj"] Mar 10 15:26:58 crc kubenswrapper[4795]: I0310 15:26:58.006958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-b68874cc4-rzntx" event={"ID":"0940a851-b873-4348-b89d-6ca90cf8646f","Type":"ContainerStarted","Data":"31554e594d813bdab20b09ab18399d11b7b783edafb99aeebcb49fa609ba2794"} Mar 10 15:26:58 crc kubenswrapper[4795]: I0310 15:26:58.007282 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:58 crc kubenswrapper[4795]: I0310 15:26:58.007303 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:26:58 crc kubenswrapper[4795]: I0310 15:26:58.014861 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" event={"ID":"071b61e9-efc3-4421-87b8-fab46d1c1f6e","Type":"ContainerStarted","Data":"e7055f6be92fed0ae9511a4f29b0e0e41242148cac775609f481ddbc4abbb7c8"} Mar 10 15:26:58 crc kubenswrapper[4795]: I0310 15:26:58.039782 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-b68874cc4-rzntx" podStartSLOduration=4.039694199 podStartE2EDuration="4.039694199s" podCreationTimestamp="2026-03-10 15:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:26:58.027820959 +0000 UTC m=+1251.193561857" watchObservedRunningTime="2026-03-10 15:26:58.039694199 +0000 UTC m=+1251.205435097" Mar 10 15:26:58 crc kubenswrapper[4795]: I0310 15:26:58.046496 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" podStartSLOduration=4.3095237730000004 podStartE2EDuration="8.046475923s" podCreationTimestamp="2026-03-10 15:26:50 +0000 UTC" firstStartedPulling="2026-03-10 15:26:52.417497025 +0000 UTC m=+1245.583237933" lastFinishedPulling="2026-03-10 15:26:56.154449195 +0000 UTC m=+1249.320190083" observedRunningTime="2026-03-10 15:26:58.045300129 +0000 UTC m=+1251.211041027" watchObservedRunningTime="2026-03-10 15:26:58.046475923 +0000 UTC m=+1251.212216811" Mar 10 15:26:59 crc kubenswrapper[4795]: I0310 15:26:59.027728 4795 generic.go:334] "Generic (PLEG): container finished" podID="7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" containerID="7aabf5d7cb7726116bb126dfadad56556c53264e0490c57ab36ce5b890be4f26" exitCode=0 Mar 10 15:26:59 crc kubenswrapper[4795]: I0310 15:26:59.027997 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/cinder-db-sync-mg79m" event={"ID":"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2","Type":"ContainerDied","Data":"7aabf5d7cb7726116bb126dfadad56556c53264e0490c57ab36ce5b890be4f26"} Mar 10 15:26:59 crc kubenswrapper[4795]: I0310 15:26:59.028319 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" podUID="071b61e9-efc3-4421-87b8-fab46d1c1f6e" containerName="barbican-keystone-listener-log" containerID="cri-o://c960fa9d44515a2502f8b594165f1c50c5e07e010ca622265755fbdbfccad43e" gracePeriod=30 Mar 10 15:26:59 crc kubenswrapper[4795]: I0310 15:26:59.028407 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" podUID="071b61e9-efc3-4421-87b8-fab46d1c1f6e" containerName="barbican-keystone-listener" containerID="cri-o://e7055f6be92fed0ae9511a4f29b0e0e41242148cac775609f481ddbc4abbb7c8" gracePeriod=30 Mar 10 15:26:59 crc kubenswrapper[4795]: I0310 15:26:59.028759 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-58794994d5-4hvvj" podUID="cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" containerName="barbican-worker-log" containerID="cri-o://26bea14b0a8fdacc73844dff046507f7cfc99619978d97823b4ebec17b030e55" gracePeriod=30 Mar 10 15:26:59 crc kubenswrapper[4795]: I0310 15:26:59.028829 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-58794994d5-4hvvj" podUID="cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" containerName="barbican-worker" containerID="cri-o://91333c4383342b5e6e355f280ab7c8970ff1e9072cb33e1c8fc540d799975268" gracePeriod=30 Mar 10 15:27:00 crc kubenswrapper[4795]: I0310 15:27:00.044313 4795 generic.go:334] "Generic (PLEG): container finished" podID="071b61e9-efc3-4421-87b8-fab46d1c1f6e" containerID="e7055f6be92fed0ae9511a4f29b0e0e41242148cac775609f481ddbc4abbb7c8" exitCode=0 Mar 10 15:27:00 crc 
kubenswrapper[4795]: I0310 15:27:00.044629 4795 generic.go:334] "Generic (PLEG): container finished" podID="071b61e9-efc3-4421-87b8-fab46d1c1f6e" containerID="c960fa9d44515a2502f8b594165f1c50c5e07e010ca622265755fbdbfccad43e" exitCode=143 Mar 10 15:27:00 crc kubenswrapper[4795]: I0310 15:27:00.044494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" event={"ID":"071b61e9-efc3-4421-87b8-fab46d1c1f6e","Type":"ContainerDied","Data":"e7055f6be92fed0ae9511a4f29b0e0e41242148cac775609f481ddbc4abbb7c8"} Mar 10 15:27:00 crc kubenswrapper[4795]: I0310 15:27:00.044703 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" event={"ID":"071b61e9-efc3-4421-87b8-fab46d1c1f6e","Type":"ContainerDied","Data":"c960fa9d44515a2502f8b594165f1c50c5e07e010ca622265755fbdbfccad43e"} Mar 10 15:27:00 crc kubenswrapper[4795]: I0310 15:27:00.046758 4795 generic.go:334] "Generic (PLEG): container finished" podID="cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" containerID="91333c4383342b5e6e355f280ab7c8970ff1e9072cb33e1c8fc540d799975268" exitCode=0 Mar 10 15:27:00 crc kubenswrapper[4795]: I0310 15:27:00.046784 4795 generic.go:334] "Generic (PLEG): container finished" podID="cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" containerID="26bea14b0a8fdacc73844dff046507f7cfc99619978d97823b4ebec17b030e55" exitCode=143 Mar 10 15:27:00 crc kubenswrapper[4795]: I0310 15:27:00.046818 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58794994d5-4hvvj" event={"ID":"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a","Type":"ContainerDied","Data":"91333c4383342b5e6e355f280ab7c8970ff1e9072cb33e1c8fc540d799975268"} Mar 10 15:27:00 crc kubenswrapper[4795]: I0310 15:27:00.046843 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58794994d5-4hvvj" 
event={"ID":"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a","Type":"ContainerDied","Data":"26bea14b0a8fdacc73844dff046507f7cfc99619978d97823b4ebec17b030e55"} Mar 10 15:27:01 crc kubenswrapper[4795]: I0310 15:27:01.370032 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:27:01 crc kubenswrapper[4795]: I0310 15:27:01.448169 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5mr9"] Mar 10 15:27:01 crc kubenswrapper[4795]: I0310 15:27:01.448458 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" podUID="69bec5a2-e73e-4319-95b2-093ff9223751" containerName="dnsmasq-dns" containerID="cri-o://22cd34a61e4a54f714608081ebd8444a7aa3a9ab9e495bb0c4e4919582395480" gracePeriod=10 Mar 10 15:27:01 crc kubenswrapper[4795]: I0310 15:27:01.470839 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67547556b6-45876" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 10 15:27:01 crc kubenswrapper[4795]: I0310 15:27:01.556030 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5869d54dfb-2wjww" podUID="379541ea-de81-488c-b6dc-2f5873fdfbeb" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.048165 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mg79m" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.092681 4795 generic.go:334] "Generic (PLEG): container finished" podID="69bec5a2-e73e-4319-95b2-093ff9223751" containerID="22cd34a61e4a54f714608081ebd8444a7aa3a9ab9e495bb0c4e4919582395480" exitCode=0 Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.092727 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" event={"ID":"69bec5a2-e73e-4319-95b2-093ff9223751","Type":"ContainerDied","Data":"22cd34a61e4a54f714608081ebd8444a7aa3a9ab9e495bb0c4e4919582395480"} Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.102502 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mg79m" event={"ID":"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2","Type":"ContainerDied","Data":"290c84b4eb2f3f9f70945196eaf23a71165539683f6eaf0a6ed625dd65526e8e"} Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.102544 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="290c84b4eb2f3f9f70945196eaf23a71165539683f6eaf0a6ed625dd65526e8e" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.102614 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mg79m" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.175148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw449\" (UniqueName: \"kubernetes.io/projected/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-kube-api-access-tw449\") pod \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.175211 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-config-data\") pod \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.175240 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-combined-ca-bundle\") pod \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.175335 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-etc-machine-id\") pod \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.175367 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-scripts\") pod \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.175467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-db-sync-config-data\") pod \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\" (UID: \"7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2\") " Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.180872 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" (UID: "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.198281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" (UID: "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.204496 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-scripts" (OuterVolumeSpecName: "scripts") pod "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" (UID: "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.205598 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-kube-api-access-tw449" (OuterVolumeSpecName: "kube-api-access-tw449") pod "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" (UID: "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2"). InnerVolumeSpecName "kube-api-access-tw449". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.271276 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-config-data" (OuterVolumeSpecName: "config-data") pod "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" (UID: "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.281144 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw449\" (UniqueName: \"kubernetes.io/projected/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-kube-api-access-tw449\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.281174 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.281184 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.281192 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.281202 4795 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.294178 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" (UID: "7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:02 crc kubenswrapper[4795]: I0310 15:27:02.382248 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.088026 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.361166 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.436025 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.458503 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:27:03 crc kubenswrapper[4795]: E0310 15:27:03.458927 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" containerName="cinder-db-sync" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.458943 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" containerName="cinder-db-sync" Mar 10 15:27:03 crc kubenswrapper[4795]: E0310 15:27:03.458963 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071b61e9-efc3-4421-87b8-fab46d1c1f6e" containerName="barbican-keystone-listener-log" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.458972 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="071b61e9-efc3-4421-87b8-fab46d1c1f6e" containerName="barbican-keystone-listener-log" Mar 10 15:27:03 crc kubenswrapper[4795]: E0310 15:27:03.458986 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071b61e9-efc3-4421-87b8-fab46d1c1f6e" containerName="barbican-keystone-listener" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.458992 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="071b61e9-efc3-4421-87b8-fab46d1c1f6e" containerName="barbican-keystone-listener" Mar 10 15:27:03 crc kubenswrapper[4795]: E0310 15:27:03.459004 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69bec5a2-e73e-4319-95b2-093ff9223751" containerName="dnsmasq-dns" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.459009 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bec5a2-e73e-4319-95b2-093ff9223751" containerName="dnsmasq-dns" Mar 10 15:27:03 crc kubenswrapper[4795]: E0310 15:27:03.459025 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="69bec5a2-e73e-4319-95b2-093ff9223751" containerName="init" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.459031 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bec5a2-e73e-4319-95b2-093ff9223751" containerName="init" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.459188 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="071b61e9-efc3-4421-87b8-fab46d1c1f6e" containerName="barbican-keystone-listener-log" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.459204 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" containerName="cinder-db-sync" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.459222 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="69bec5a2-e73e-4319-95b2-093ff9223751" containerName="dnsmasq-dns" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.459233 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="071b61e9-efc3-4421-87b8-fab46d1c1f6e" containerName="barbican-keystone-listener" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.460109 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.467730 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-nck2z" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.467975 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.468109 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.468486 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.473543 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.510531 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-nb\") pod \"69bec5a2-e73e-4319-95b2-093ff9223751\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.511135 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n64x\" (UniqueName: \"kubernetes.io/projected/69bec5a2-e73e-4319-95b2-093ff9223751-kube-api-access-5n64x\") pod \"69bec5a2-e73e-4319-95b2-093ff9223751\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.511235 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-swift-storage-0\") pod \"69bec5a2-e73e-4319-95b2-093ff9223751\" (UID: 
\"69bec5a2-e73e-4319-95b2-093ff9223751\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.511493 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-svc\") pod \"69bec5a2-e73e-4319-95b2-093ff9223751\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.511574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-config\") pod \"69bec5a2-e73e-4319-95b2-093ff9223751\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.511786 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data\") pod \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.511869 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-combined-ca-bundle\") pod \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.512099 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data-custom\") pod \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.515694 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/071b61e9-efc3-4421-87b8-fab46d1c1f6e-logs\") pod \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.515861 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-sb\") pod \"69bec5a2-e73e-4319-95b2-093ff9223751\" (UID: \"69bec5a2-e73e-4319-95b2-093ff9223751\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.516038 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckx8n\" (UniqueName: \"kubernetes.io/projected/071b61e9-efc3-4421-87b8-fab46d1c1f6e-kube-api-access-ckx8n\") pod \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\" (UID: \"071b61e9-efc3-4421-87b8-fab46d1c1f6e\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.518134 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.518292 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-scripts\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.518571 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkm2n\" (UniqueName: \"kubernetes.io/projected/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-kube-api-access-rkm2n\") pod \"cinder-scheduler-0\" (UID: 
\"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.518746 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.518875 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.519169 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.525159 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071b61e9-efc3-4421-87b8-fab46d1c1f6e-kube-api-access-ckx8n" (OuterVolumeSpecName: "kube-api-access-ckx8n") pod "071b61e9-efc3-4421-87b8-fab46d1c1f6e" (UID: "071b61e9-efc3-4421-87b8-fab46d1c1f6e"). InnerVolumeSpecName "kube-api-access-ckx8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.611744 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/071b61e9-efc3-4421-87b8-fab46d1c1f6e-logs" (OuterVolumeSpecName: "logs") pod "071b61e9-efc3-4421-87b8-fab46d1c1f6e" (UID: "071b61e9-efc3-4421-87b8-fab46d1c1f6e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.617967 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69bec5a2-e73e-4319-95b2-093ff9223751-kube-api-access-5n64x" (OuterVolumeSpecName: "kube-api-access-5n64x") pod "69bec5a2-e73e-4319-95b2-093ff9223751" (UID: "69bec5a2-e73e-4319-95b2-093ff9223751"). InnerVolumeSpecName "kube-api-access-5n64x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.629118 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data\") pod \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.629336 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data-custom\") pod \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.629625 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-combined-ca-bundle\") pod \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " Mar 
10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.630274 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8t9r\" (UniqueName: \"kubernetes.io/projected/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-kube-api-access-w8t9r\") pod \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.630407 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-logs\") pod \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\" (UID: \"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a\") " Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.630769 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.630880 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.631003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-scripts\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.631196 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkm2n\" (UniqueName: 
\"kubernetes.io/projected/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-kube-api-access-rkm2n\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.631285 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.631400 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.631532 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n64x\" (UniqueName: \"kubernetes.io/projected/69bec5a2-e73e-4319-95b2-093ff9223751-kube-api-access-5n64x\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.632033 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/071b61e9-efc3-4421-87b8-fab46d1c1f6e-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.632129 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckx8n\" (UniqueName: \"kubernetes.io/projected/071b61e9-efc3-4421-87b8-fab46d1c1f6e-kube-api-access-ckx8n\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.632375 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.637919 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-logs" (OuterVolumeSpecName: "logs") pod "cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" (UID: "cbfd0622-d1dd-4b73-b278-863ba3ba7d2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.649315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.650047 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "071b61e9-efc3-4421-87b8-fab46d1c1f6e" (UID: "071b61e9-efc3-4421-87b8-fab46d1c1f6e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.655409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-kube-api-access-w8t9r" (OuterVolumeSpecName: "kube-api-access-w8t9r") pod "cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" (UID: "cbfd0622-d1dd-4b73-b278-863ba3ba7d2a"). InnerVolumeSpecName "kube-api-access-w8t9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.656097 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.691462 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" (UID: "cbfd0622-d1dd-4b73-b278-863ba3ba7d2a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.706692 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-scripts\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.708376 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-wwmtp"] Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.708558 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: E0310 15:27:03.708768 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" containerName="barbican-worker" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.708787 4795 
state_mem.go:107] "Deleted CPUSet assignment" podUID="cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" containerName="barbican-worker" Mar 10 15:27:03 crc kubenswrapper[4795]: E0310 15:27:03.708806 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" containerName="barbican-worker-log" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.708813 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" containerName="barbican-worker-log" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.709026 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" containerName="barbican-worker-log" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.709044 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" containerName="barbican-worker" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.710299 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-wwmtp"] Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.710337 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.710500 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.723605 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkm2n\" (UniqueName: \"kubernetes.io/projected/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-kube-api-access-rkm2n\") pod \"cinder-scheduler-0\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.723709 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.736789 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "69bec5a2-e73e-4319-95b2-093ff9223751" (UID: "69bec5a2-e73e-4319-95b2-093ff9223751"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.738174 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.738190 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.738199 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.738208 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.738216 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8t9r\" (UniqueName: \"kubernetes.io/projected/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-kube-api-access-w8t9r\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.801659 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.834317 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.835712 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.842114 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.843420 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-svc\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.843502 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-config\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.843576 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmldx\" (UniqueName: \"kubernetes.io/projected/63613082-2a89-4b47-b33e-c1851b7b95fe-kube-api-access-kmldx\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.843609 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.843640 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.843667 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.865496 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.951588 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.951958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952051 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecc72682-89c5-4521-a83d-df982c518cbe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:03 crc 
kubenswrapper[4795]: I0310 15:27:03.952322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952475 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-svc\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952570 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952631 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data-custom\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952668 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecc72682-89c5-4521-a83d-df982c518cbe-logs\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952702 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-config\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952784 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-scripts\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952789 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952928 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmldx\" (UniqueName: 
\"kubernetes.io/projected/63613082-2a89-4b47-b33e-c1851b7b95fe-kube-api-access-kmldx\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.952970 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cch7\" (UniqueName: \"kubernetes.io/projected/ecc72682-89c5-4521-a83d-df982c518cbe-kube-api-access-2cch7\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.953640 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-config\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.954541 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.957321 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-svc\") pod \"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:03 crc kubenswrapper[4795]: I0310 15:27:03.978644 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmldx\" (UniqueName: \"kubernetes.io/projected/63613082-2a89-4b47-b33e-c1851b7b95fe-kube-api-access-kmldx\") pod 
\"dnsmasq-dns-6578955fd5-wwmtp\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.040509 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.057082 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cch7\" (UniqueName: \"kubernetes.io/projected/ecc72682-89c5-4521-a83d-df982c518cbe-kube-api-access-2cch7\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.057148 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecc72682-89c5-4521-a83d-df982c518cbe-etc-machine-id\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.057198 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.057224 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.057234 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecc72682-89c5-4521-a83d-df982c518cbe-etc-machine-id\") 
pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.057247 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data-custom\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.057307 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecc72682-89c5-4521-a83d-df982c518cbe-logs\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.057353 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-scripts\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.060015 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "69bec5a2-e73e-4319-95b2-093ff9223751" (UID: "69bec5a2-e73e-4319-95b2-093ff9223751"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.060371 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecc72682-89c5-4521-a83d-df982c518cbe-logs\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.063373 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.063832 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data-custom\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.070714 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-scripts\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.071262 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.076589 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "071b61e9-efc3-4421-87b8-fab46d1c1f6e" (UID: "071b61e9-efc3-4421-87b8-fab46d1c1f6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.080027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cch7\" (UniqueName: \"kubernetes.io/projected/ecc72682-89c5-4521-a83d-df982c518cbe-kube-api-access-2cch7\") pod \"cinder-api-0\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.089196 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-config" (OuterVolumeSpecName: "config") pod "69bec5a2-e73e-4319-95b2-093ff9223751" (UID: "69bec5a2-e73e-4319-95b2-093ff9223751"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.100204 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" (UID: "cbfd0622-d1dd-4b73-b278-863ba3ba7d2a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4795]: E0310 15:27:04.127536 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="79684c90-dc4f-4187-a086-c0777de981e3" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.145281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "69bec5a2-e73e-4319-95b2-093ff9223751" (UID: "69bec5a2-e73e-4319-95b2-093ff9223751"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.160090 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.160114 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.160124 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.160134 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.160142 4795 reconciler_common.go:293] "Volume detached for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.164505 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "69bec5a2-e73e-4319-95b2-093ff9223751" (UID: "69bec5a2-e73e-4319-95b2-093ff9223751"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.171268 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.173274 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data" (OuterVolumeSpecName: "config-data") pod "071b61e9-efc3-4421-87b8-fab46d1c1f6e" (UID: "071b61e9-efc3-4421-87b8-fab46d1c1f6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.179207 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data" (OuterVolumeSpecName: "config-data") pod "cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" (UID: "cbfd0622-d1dd-4b73-b278-863ba3ba7d2a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.187464 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-58794994d5-4hvvj" event={"ID":"cbfd0622-d1dd-4b73-b278-863ba3ba7d2a","Type":"ContainerDied","Data":"04d37e7dbe9c20d1ac141bfd430e1092895f6795e7769429e88315902787200a"} Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.187528 4795 scope.go:117] "RemoveContainer" containerID="91333c4383342b5e6e355f280ab7c8970ff1e9072cb33e1c8fc540d799975268" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.187687 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-58794994d5-4hvvj" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.207449 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" event={"ID":"69bec5a2-e73e-4319-95b2-093ff9223751","Type":"ContainerDied","Data":"a533556b8a747ea6c9075bde789e3a64664fd67df67761a9dc33f9e8254cd983"} Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.207548 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-k5mr9" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.243389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79684c90-dc4f-4187-a086-c0777de981e3","Type":"ContainerStarted","Data":"3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c"} Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.243588 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="ceilometer-notification-agent" containerID="cri-o://9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd" gracePeriod=30 Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.243868 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.244150 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="proxy-httpd" containerID="cri-o://3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c" gracePeriod=30 Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.244207 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="sg-core" containerID="cri-o://33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824" gracePeriod=30 Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.261768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" event={"ID":"071b61e9-efc3-4421-87b8-fab46d1c1f6e","Type":"ContainerDied","Data":"b1734d7265e3ecc08877ee065c66061e5a58bf8b6611768e3db15bf08c6ef560"} Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.261866 4795 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-68c5995f66-69w72" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.287179 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.287204 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/69bec5a2-e73e-4319-95b2-093ff9223751-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.287219 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071b61e9-efc3-4421-87b8-fab46d1c1f6e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.296132 4795 scope.go:117] "RemoveContainer" containerID="26bea14b0a8fdacc73844dff046507f7cfc99619978d97823b4ebec17b030e55" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.338609 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.363800 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-58794994d5-4hvvj"] Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.382476 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-58794994d5-4hvvj"] Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.389170 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5mr9"] Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.398167 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-k5mr9"] Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.409207 4795 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/barbican-keystone-listener-68c5995f66-69w72"] Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.410474 4795 scope.go:117] "RemoveContainer" containerID="22cd34a61e4a54f714608081ebd8444a7aa3a9ab9e495bb0c4e4919582395480" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.415087 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-68c5995f66-69w72"] Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.435291 4795 scope.go:117] "RemoveContainer" containerID="ef412769130fb84e3fc2a68f44bdd2e033909e334644a4ef206ce9ed2b61c984" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.475359 4795 scope.go:117] "RemoveContainer" containerID="e7055f6be92fed0ae9511a4f29b0e0e41242148cac775609f481ddbc4abbb7c8" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.522795 4795 scope.go:117] "RemoveContainer" containerID="c960fa9d44515a2502f8b594165f1c50c5e07e010ca622265755fbdbfccad43e" Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.660800 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-wwmtp"] Mar 10 15:27:04 crc kubenswrapper[4795]: I0310 15:27:04.787571 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 15:27:04 crc kubenswrapper[4795]: W0310 15:27:04.794278 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecc72682_89c5_4521_a83d_df982c518cbe.slice/crio-8b5a06a48d130ee0f4fabb93a70c872819d535b0020c2167bf187c5a48a97679 WatchSource:0}: Error finding container 8b5a06a48d130ee0f4fabb93a70c872819d535b0020c2167bf187c5a48a97679: Status 404 returned error can't find the container with id 8b5a06a48d130ee0f4fabb93a70c872819d535b0020c2167bf187c5a48a97679 Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.300053 4795 generic.go:334] "Generic (PLEG): container finished" podID="63613082-2a89-4b47-b33e-c1851b7b95fe" 
containerID="0b66d62478053796b463254c7b7461109d47c076b280295c44c3fad440e5a275" exitCode=0 Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.300501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" event={"ID":"63613082-2a89-4b47-b33e-c1851b7b95fe","Type":"ContainerDied","Data":"0b66d62478053796b463254c7b7461109d47c076b280295c44c3fad440e5a275"} Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.300536 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" event={"ID":"63613082-2a89-4b47-b33e-c1851b7b95fe","Type":"ContainerStarted","Data":"82c8d3f8e2815fa1e27bca840139d0a7fed3c07831029390919f07defe3d6a7f"} Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.330333 4795 generic.go:334] "Generic (PLEG): container finished" podID="79684c90-dc4f-4187-a086-c0777de981e3" containerID="3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c" exitCode=0 Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.330577 4795 generic.go:334] "Generic (PLEG): container finished" podID="79684c90-dc4f-4187-a086-c0777de981e3" containerID="33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824" exitCode=2 Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.330661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79684c90-dc4f-4187-a086-c0777de981e3","Type":"ContainerDied","Data":"3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c"} Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.330694 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79684c90-dc4f-4187-a086-c0777de981e3","Type":"ContainerDied","Data":"33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824"} Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.335364 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"37370315-b9d3-47f9-b8a4-dc1fc884ee6b","Type":"ContainerStarted","Data":"273917a54acbc4f62bcf52ccab2723bd27949053f6e9589caa8bb19aee58c58f"} Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.337184 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ecc72682-89c5-4521-a83d-df982c518cbe","Type":"ContainerStarted","Data":"8b5a06a48d130ee0f4fabb93a70c872819d535b0020c2167bf187c5a48a97679"} Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.501210 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071b61e9-efc3-4421-87b8-fab46d1c1f6e" path="/var/lib/kubelet/pods/071b61e9-efc3-4421-87b8-fab46d1c1f6e/volumes" Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.501822 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69bec5a2-e73e-4319-95b2-093ff9223751" path="/var/lib/kubelet/pods/69bec5a2-e73e-4319-95b2-093ff9223751/volumes" Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.502400 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbfd0622-d1dd-4b73-b278-863ba3ba7d2a" path="/var/lib/kubelet/pods/cbfd0622-d1dd-4b73-b278-863ba3ba7d2a/volumes" Mar 10 15:27:05 crc kubenswrapper[4795]: I0310 15:27:05.856179 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 15:27:06 crc kubenswrapper[4795]: I0310 15:27:06.327029 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:27:06 crc kubenswrapper[4795]: I0310 15:27:06.342041 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-b68874cc4-rzntx" Mar 10 15:27:06 crc kubenswrapper[4795]: I0310 15:27:06.363180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"37370315-b9d3-47f9-b8a4-dc1fc884ee6b","Type":"ContainerStarted","Data":"17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67"} Mar 10 15:27:06 crc kubenswrapper[4795]: I0310 15:27:06.365498 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ecc72682-89c5-4521-a83d-df982c518cbe","Type":"ContainerStarted","Data":"99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615"} Mar 10 15:27:06 crc kubenswrapper[4795]: I0310 15:27:06.378421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" event={"ID":"63613082-2a89-4b47-b33e-c1851b7b95fe","Type":"ContainerStarted","Data":"a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591"} Mar 10 15:27:06 crc kubenswrapper[4795]: I0310 15:27:06.378463 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:06 crc kubenswrapper[4795]: I0310 15:27:06.435306 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d485ffcdb-tpksk"] Mar 10 15:27:06 crc kubenswrapper[4795]: I0310 15:27:06.435608 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d485ffcdb-tpksk" podUID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerName="barbican-api-log" containerID="cri-o://4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75" gracePeriod=30 Mar 10 15:27:06 crc kubenswrapper[4795]: I0310 15:27:06.435688 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d485ffcdb-tpksk" podUID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerName="barbican-api" containerID="cri-o://d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702" gracePeriod=30 Mar 10 15:27:06 crc kubenswrapper[4795]: I0310 15:27:06.436038 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" podStartSLOduration=3.436029103 podStartE2EDuration="3.436029103s" podCreationTimestamp="2026-03-10 15:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:06.40517582 +0000 UTC m=+1259.570916718" watchObservedRunningTime="2026-03-10 15:27:06.436029103 +0000 UTC m=+1259.601770001" Mar 10 15:27:07 crc kubenswrapper[4795]: I0310 15:27:07.388469 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"37370315-b9d3-47f9-b8a4-dc1fc884ee6b","Type":"ContainerStarted","Data":"4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59"} Mar 10 15:27:07 crc kubenswrapper[4795]: I0310 15:27:07.405582 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ecc72682-89c5-4521-a83d-df982c518cbe","Type":"ContainerStarted","Data":"cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036"} Mar 10 15:27:07 crc kubenswrapper[4795]: I0310 15:27:07.405770 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ecc72682-89c5-4521-a83d-df982c518cbe" containerName="cinder-api-log" containerID="cri-o://99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615" gracePeriod=30 Mar 10 15:27:07 crc kubenswrapper[4795]: I0310 15:27:07.406005 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 15:27:07 crc kubenswrapper[4795]: I0310 15:27:07.406053 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="ecc72682-89c5-4521-a83d-df982c518cbe" containerName="cinder-api" containerID="cri-o://cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036" gracePeriod=30 Mar 10 15:27:07 crc kubenswrapper[4795]: I0310 15:27:07.413449 4795 generic.go:334] "Generic (PLEG): container 
finished" podID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerID="4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75" exitCode=143 Mar 10 15:27:07 crc kubenswrapper[4795]: I0310 15:27:07.414413 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d485ffcdb-tpksk" event={"ID":"ff736888-83cd-4b42-a4b3-d1e44b632972","Type":"ContainerDied","Data":"4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75"} Mar 10 15:27:07 crc kubenswrapper[4795]: I0310 15:27:07.434963 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.55833516 podStartE2EDuration="4.434940634s" podCreationTimestamp="2026-03-10 15:27:03 +0000 UTC" firstStartedPulling="2026-03-10 15:27:04.363052031 +0000 UTC m=+1257.528792929" lastFinishedPulling="2026-03-10 15:27:05.239657505 +0000 UTC m=+1258.405398403" observedRunningTime="2026-03-10 15:27:07.426326628 +0000 UTC m=+1260.592067526" watchObservedRunningTime="2026-03-10 15:27:07.434940634 +0000 UTC m=+1260.600681532" Mar 10 15:27:07 crc kubenswrapper[4795]: I0310 15:27:07.460770 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.460739702 podStartE2EDuration="4.460739702s" podCreationTimestamp="2026-03-10 15:27:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:07.448351248 +0000 UTC m=+1260.614092136" watchObservedRunningTime="2026-03-10 15:27:07.460739702 +0000 UTC m=+1260.626480600" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.020017 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.062046 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data\") pod \"ecc72682-89c5-4521-a83d-df982c518cbe\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.062167 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-combined-ca-bundle\") pod \"ecc72682-89c5-4521-a83d-df982c518cbe\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.062213 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cch7\" (UniqueName: \"kubernetes.io/projected/ecc72682-89c5-4521-a83d-df982c518cbe-kube-api-access-2cch7\") pod \"ecc72682-89c5-4521-a83d-df982c518cbe\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.062242 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecc72682-89c5-4521-a83d-df982c518cbe-etc-machine-id\") pod \"ecc72682-89c5-4521-a83d-df982c518cbe\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.062285 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-scripts\") pod \"ecc72682-89c5-4521-a83d-df982c518cbe\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.062410 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data-custom\") pod \"ecc72682-89c5-4521-a83d-df982c518cbe\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.062456 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecc72682-89c5-4521-a83d-df982c518cbe-logs\") pod \"ecc72682-89c5-4521-a83d-df982c518cbe\" (UID: \"ecc72682-89c5-4521-a83d-df982c518cbe\") " Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.062735 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecc72682-89c5-4521-a83d-df982c518cbe-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ecc72682-89c5-4521-a83d-df982c518cbe" (UID: "ecc72682-89c5-4521-a83d-df982c518cbe"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.062978 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecc72682-89c5-4521-a83d-df982c518cbe-logs" (OuterVolumeSpecName: "logs") pod "ecc72682-89c5-4521-a83d-df982c518cbe" (UID: "ecc72682-89c5-4521-a83d-df982c518cbe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.068700 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ecc72682-89c5-4521-a83d-df982c518cbe" (UID: "ecc72682-89c5-4521-a83d-df982c518cbe"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.072045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-scripts" (OuterVolumeSpecName: "scripts") pod "ecc72682-89c5-4521-a83d-df982c518cbe" (UID: "ecc72682-89c5-4521-a83d-df982c518cbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.073224 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc72682-89c5-4521-a83d-df982c518cbe-kube-api-access-2cch7" (OuterVolumeSpecName: "kube-api-access-2cch7") pod "ecc72682-89c5-4521-a83d-df982c518cbe" (UID: "ecc72682-89c5-4521-a83d-df982c518cbe"). InnerVolumeSpecName "kube-api-access-2cch7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.098253 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecc72682-89c5-4521-a83d-df982c518cbe" (UID: "ecc72682-89c5-4521-a83d-df982c518cbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.125339 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data" (OuterVolumeSpecName: "config-data") pod "ecc72682-89c5-4521-a83d-df982c518cbe" (UID: "ecc72682-89c5-4521-a83d-df982c518cbe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.165118 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.165164 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.165180 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cch7\" (UniqueName: \"kubernetes.io/projected/ecc72682-89c5-4521-a83d-df982c518cbe-kube-api-access-2cch7\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.165198 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ecc72682-89c5-4521-a83d-df982c518cbe-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.165211 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.165223 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ecc72682-89c5-4521-a83d-df982c518cbe-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.165235 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ecc72682-89c5-4521-a83d-df982c518cbe-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.424230 4795 generic.go:334] "Generic 
(PLEG): container finished" podID="ecc72682-89c5-4521-a83d-df982c518cbe" containerID="cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036" exitCode=0 Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.424258 4795 generic.go:334] "Generic (PLEG): container finished" podID="ecc72682-89c5-4521-a83d-df982c518cbe" containerID="99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615" exitCode=143 Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.424275 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ecc72682-89c5-4521-a83d-df982c518cbe","Type":"ContainerDied","Data":"cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036"} Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.424325 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.424352 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ecc72682-89c5-4521-a83d-df982c518cbe","Type":"ContainerDied","Data":"99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615"} Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.424364 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"ecc72682-89c5-4521-a83d-df982c518cbe","Type":"ContainerDied","Data":"8b5a06a48d130ee0f4fabb93a70c872819d535b0020c2167bf187c5a48a97679"} Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.424381 4795 scope.go:117] "RemoveContainer" containerID="cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.454912 4795 scope.go:117] "RemoveContainer" containerID="99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.471130 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 10 
15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.485944 4795 scope.go:117] "RemoveContainer" containerID="cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036" Mar 10 15:27:08 crc kubenswrapper[4795]: E0310 15:27:08.486384 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036\": container with ID starting with cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036 not found: ID does not exist" containerID="cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.486416 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036"} err="failed to get container status \"cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036\": rpc error: code = NotFound desc = could not find container \"cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036\": container with ID starting with cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036 not found: ID does not exist" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.486439 4795 scope.go:117] "RemoveContainer" containerID="99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615" Mar 10 15:27:08 crc kubenswrapper[4795]: E0310 15:27:08.486668 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615\": container with ID starting with 99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615 not found: ID does not exist" containerID="99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.486689 4795 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615"} err="failed to get container status \"99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615\": rpc error: code = NotFound desc = could not find container \"99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615\": container with ID starting with 99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615 not found: ID does not exist" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.486701 4795 scope.go:117] "RemoveContainer" containerID="cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.486861 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036"} err="failed to get container status \"cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036\": rpc error: code = NotFound desc = could not find container \"cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036\": container with ID starting with cdb5b85e8acaf7a0995d411042c06cd4f49a4c456ea7d0e15b15f703c92d7036 not found: ID does not exist" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.486874 4795 scope.go:117] "RemoveContainer" containerID="99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.487867 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615"} err="failed to get container status \"99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615\": rpc error: code = NotFound desc = could not find container \"99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615\": container with ID starting with 99322a9cf71e3761f3559a89fccd4ec2390637fa5c196d4e1351490796bb9615 not found: ID 
does not exist" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.496862 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.503455 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 10 15:27:08 crc kubenswrapper[4795]: E0310 15:27:08.503897 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc72682-89c5-4521-a83d-df982c518cbe" containerName="cinder-api" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.503914 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc72682-89c5-4521-a83d-df982c518cbe" containerName="cinder-api" Mar 10 15:27:08 crc kubenswrapper[4795]: E0310 15:27:08.503931 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc72682-89c5-4521-a83d-df982c518cbe" containerName="cinder-api-log" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.503940 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc72682-89c5-4521-a83d-df982c518cbe" containerName="cinder-api-log" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.504163 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc72682-89c5-4521-a83d-df982c518cbe" containerName="cinder-api-log" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.504181 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc72682-89c5-4521-a83d-df982c518cbe" containerName="cinder-api" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.505129 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.510397 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.543123 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.543232 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.543274 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.571005 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.571047 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-config-data\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.571084 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wd7q\" (UniqueName: \"kubernetes.io/projected/eaa71521-8502-4baa-a81f-7c8147ffd6a5-kube-api-access-9wd7q\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.571128 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-scripts\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.571149 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.571168 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.571188 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaa71521-8502-4baa-a81f-7c8147ffd6a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.571207 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.571301 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa71521-8502-4baa-a81f-7c8147ffd6a5-logs\") pod \"cinder-api-0\" 
(UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.673499 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa71521-8502-4baa-a81f-7c8147ffd6a5-logs\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.673574 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.673605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-config-data\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.673632 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wd7q\" (UniqueName: \"kubernetes.io/projected/eaa71521-8502-4baa-a81f-7c8147ffd6a5-kube-api-access-9wd7q\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.673684 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-scripts\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.673712 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.673743 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.674044 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaa71521-8502-4baa-a81f-7c8147ffd6a5-logs\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.674105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaa71521-8502-4baa-a81f-7c8147ffd6a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.674618 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaa71521-8502-4baa-a81f-7c8147ffd6a5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.674674 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc 
kubenswrapper[4795]: I0310 15:27:08.677317 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.677543 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-config-data-custom\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.678724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.679013 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-config-data\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.679098 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.680294 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaa71521-8502-4baa-a81f-7c8147ffd6a5-scripts\") pod \"cinder-api-0\" (UID: 
\"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.693755 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wd7q\" (UniqueName: \"kubernetes.io/projected/eaa71521-8502-4baa-a81f-7c8147ffd6a5-kube-api-access-9wd7q\") pod \"cinder-api-0\" (UID: \"eaa71521-8502-4baa-a81f-7c8147ffd6a5\") " pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.786369 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 10 15:27:08 crc kubenswrapper[4795]: I0310 15:27:08.802524 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.022979 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.085060 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-run-httpd\") pod \"79684c90-dc4f-4187-a086-c0777de981e3\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.085125 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-combined-ca-bundle\") pod \"79684c90-dc4f-4187-a086-c0777de981e3\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.085180 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdqbv\" (UniqueName: \"kubernetes.io/projected/79684c90-dc4f-4187-a086-c0777de981e3-kube-api-access-rdqbv\") pod \"79684c90-dc4f-4187-a086-c0777de981e3\" (UID: 
\"79684c90-dc4f-4187-a086-c0777de981e3\") " Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.085256 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-log-httpd\") pod \"79684c90-dc4f-4187-a086-c0777de981e3\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.085395 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-sg-core-conf-yaml\") pod \"79684c90-dc4f-4187-a086-c0777de981e3\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.085420 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-config-data\") pod \"79684c90-dc4f-4187-a086-c0777de981e3\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.085473 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-scripts\") pod \"79684c90-dc4f-4187-a086-c0777de981e3\" (UID: \"79684c90-dc4f-4187-a086-c0777de981e3\") " Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.085894 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "79684c90-dc4f-4187-a086-c0777de981e3" (UID: "79684c90-dc4f-4187-a086-c0777de981e3"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.086168 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "79684c90-dc4f-4187-a086-c0777de981e3" (UID: "79684c90-dc4f-4187-a086-c0777de981e3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.090677 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-scripts" (OuterVolumeSpecName: "scripts") pod "79684c90-dc4f-4187-a086-c0777de981e3" (UID: "79684c90-dc4f-4187-a086-c0777de981e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.090944 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79684c90-dc4f-4187-a086-c0777de981e3-kube-api-access-rdqbv" (OuterVolumeSpecName: "kube-api-access-rdqbv") pod "79684c90-dc4f-4187-a086-c0777de981e3" (UID: "79684c90-dc4f-4187-a086-c0777de981e3"). InnerVolumeSpecName "kube-api-access-rdqbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.112089 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "79684c90-dc4f-4187-a086-c0777de981e3" (UID: "79684c90-dc4f-4187-a086-c0777de981e3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.136551 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79684c90-dc4f-4187-a086-c0777de981e3" (UID: "79684c90-dc4f-4187-a086-c0777de981e3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.181214 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-config-data" (OuterVolumeSpecName: "config-data") pod "79684c90-dc4f-4187-a086-c0777de981e3" (UID: "79684c90-dc4f-4187-a086-c0777de981e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.187543 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.187578 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.187590 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.187599 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:09 crc 
kubenswrapper[4795]: I0310 15:27:09.187608 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79684c90-dc4f-4187-a086-c0777de981e3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.187617 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdqbv\" (UniqueName: \"kubernetes.io/projected/79684c90-dc4f-4187-a086-c0777de981e3-kube-api-access-rdqbv\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.187627 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79684c90-dc4f-4187-a086-c0777de981e3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.255846 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.443762 4795 generic.go:334] "Generic (PLEG): container finished" podID="79684c90-dc4f-4187-a086-c0777de981e3" containerID="9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd" exitCode=0 Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.443903 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79684c90-dc4f-4187-a086-c0777de981e3","Type":"ContainerDied","Data":"9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd"} Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.443980 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79684c90-dc4f-4187-a086-c0777de981e3","Type":"ContainerDied","Data":"2933eaf16e635f6b4a61564a93fb463e1ed195ae230c20f21c338a26187d3297"} Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.443871 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.444007 4795 scope.go:117] "RemoveContainer" containerID="3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.449186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eaa71521-8502-4baa-a81f-7c8147ffd6a5","Type":"ContainerStarted","Data":"b001b0408dea7a245040afdfe7dfda0fff3aefc502d502791910182f4f7abc2d"} Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.496629 4795 scope.go:117] "RemoveContainer" containerID="33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.508263 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc72682-89c5-4521-a83d-df982c518cbe" path="/var/lib/kubelet/pods/ecc72682-89c5-4521-a83d-df982c518cbe/volumes" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.543346 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.546620 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.577119 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.577751 4795 scope.go:117] "RemoveContainer" containerID="9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.589405 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:09 crc kubenswrapper[4795]: E0310 15:27:09.589819 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="sg-core" Mar 10 15:27:09 crc kubenswrapper[4795]: 
I0310 15:27:09.589837 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="sg-core" Mar 10 15:27:09 crc kubenswrapper[4795]: E0310 15:27:09.589862 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="proxy-httpd" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.589868 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="proxy-httpd" Mar 10 15:27:09 crc kubenswrapper[4795]: E0310 15:27:09.589884 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="ceilometer-notification-agent" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.589890 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="ceilometer-notification-agent" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.590029 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="ceilometer-notification-agent" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.590048 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="sg-core" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.590082 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="79684c90-dc4f-4187-a086-c0777de981e3" containerName="proxy-httpd" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.591814 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.594225 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.596439 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.631337 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.664820 4795 scope.go:117] "RemoveContainer" containerID="3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c" Mar 10 15:27:09 crc kubenswrapper[4795]: E0310 15:27:09.669213 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c\": container with ID starting with 3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c not found: ID does not exist" containerID="3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.669268 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c"} err="failed to get container status \"3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c\": rpc error: code = NotFound desc = could not find container \"3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c\": container with ID starting with 3dd72027f06183cdeb7c4508b18769cc04cf87dfff3c252dd50bef1328e3461c not found: ID does not exist" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.669299 4795 scope.go:117] "RemoveContainer" containerID="33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824" Mar 10 15:27:09 crc kubenswrapper[4795]: E0310 
15:27:09.669689 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824\": container with ID starting with 33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824 not found: ID does not exist" containerID="33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.669725 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824"} err="failed to get container status \"33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824\": rpc error: code = NotFound desc = could not find container \"33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824\": container with ID starting with 33b23babfeb27667c84c469561efd199b8f5fe5dc55e1180e5dbe6bf0c8d6824 not found: ID does not exist" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.669742 4795 scope.go:117] "RemoveContainer" containerID="9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.670284 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d485ffcdb-tpksk" podUID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:47872->10.217.0.165:9311: read: connection reset by peer" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.670284 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d485ffcdb-tpksk" podUID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:47878->10.217.0.165:9311: read: connection reset by peer" Mar 10 15:27:09 
crc kubenswrapper[4795]: E0310 15:27:09.671827 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd\": container with ID starting with 9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd not found: ID does not exist" containerID="9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.671852 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd"} err="failed to get container status \"9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd\": rpc error: code = NotFound desc = could not find container \"9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd\": container with ID starting with 9c591768f3e420adaa1b9efb5a8ad3c0cff95c3589387d1d08a24b8d13671fbd not found: ID does not exist" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.710997 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.711080 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-log-httpd\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.711111 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.711168 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-config-data\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.711211 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-scripts\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.711244 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-run-httpd\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.711260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vb2q\" (UniqueName: \"kubernetes.io/projected/ada9ed25-edcf-4a12-89ed-5a195477f439-kube-api-access-6vb2q\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.814523 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-config-data\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " 
pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.814697 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-scripts\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.814782 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-run-httpd\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.814889 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vb2q\" (UniqueName: \"kubernetes.io/projected/ada9ed25-edcf-4a12-89ed-5a195477f439-kube-api-access-6vb2q\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.815138 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.816235 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-log-httpd\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.816302 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.818515 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-log-httpd\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.818597 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-run-httpd\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.822142 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-config-data\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.828612 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c97bfbc45-gfdc4"] Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.828890 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c97bfbc45-gfdc4" podUID="382ef56c-07e8-4af1-87ee-09887564eabc" containerName="neutron-api" containerID="cri-o://ffd79a8c9c339f8985e86f020f3ac03f340c3dce531af0d8b24567498c3ce454" gracePeriod=30 Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.830724 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5c97bfbc45-gfdc4" podUID="382ef56c-07e8-4af1-87ee-09887564eabc" containerName="neutron-httpd" 
containerID="cri-o://7b8f5325d3474a9157d36c929c0d92da4e5b6b211af56ce7d8b8ec9ba7aa95be" gracePeriod=30 Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.858231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.861838 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-scripts\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.867255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.869045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vb2q\" (UniqueName: \"kubernetes.io/projected/ada9ed25-edcf-4a12-89ed-5a195477f439-kube-api-access-6vb2q\") pod \"ceilometer-0\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " pod="openstack/ceilometer-0" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.883595 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-9f8b48dd7-fxv5t"] Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.885095 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.911926 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9f8b48dd7-fxv5t"] Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.942246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-public-tls-certs\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.942301 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-httpd-config\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.942317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-ovndb-tls-certs\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.942434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92fcb\" (UniqueName: \"kubernetes.io/projected/a43b4908-0c2c-4d7c-8aff-1cd405684654-kube-api-access-92fcb\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.942558 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-config\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.942713 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-internal-tls-certs\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.942739 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-combined-ca-bundle\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:09 crc kubenswrapper[4795]: I0310 15:27:09.966880 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.044508 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-internal-tls-certs\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.044549 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-combined-ca-bundle\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.044598 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-public-tls-certs\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.044620 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-httpd-config\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.044636 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-ovndb-tls-certs\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 
15:27:10.044688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92fcb\" (UniqueName: \"kubernetes.io/projected/a43b4908-0c2c-4d7c-8aff-1cd405684654-kube-api-access-92fcb\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.044743 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-config\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.082957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-internal-tls-certs\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.083981 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-httpd-config\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.085713 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-public-tls-certs\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.086491 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-combined-ca-bundle\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.091932 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-config\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.102978 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92fcb\" (UniqueName: \"kubernetes.io/projected/a43b4908-0c2c-4d7c-8aff-1cd405684654-kube-api-access-92fcb\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.104318 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b4908-0c2c-4d7c-8aff-1cd405684654-ovndb-tls-certs\") pod \"neutron-9f8b48dd7-fxv5t\" (UID: \"a43b4908-0c2c-4d7c-8aff-1cd405684654\") " pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.211910 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5c97bfbc45-gfdc4" podUID="382ef56c-07e8-4af1-87ee-09887564eabc" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": read tcp 10.217.0.2:52210->10.217.0.157:9696: read: connection reset by peer" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.319741 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.325947 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.450826 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwwdh\" (UniqueName: \"kubernetes.io/projected/ff736888-83cd-4b42-a4b3-d1e44b632972-kube-api-access-nwwdh\") pod \"ff736888-83cd-4b42-a4b3-d1e44b632972\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.450869 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data\") pod \"ff736888-83cd-4b42-a4b3-d1e44b632972\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.450900 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-combined-ca-bundle\") pod \"ff736888-83cd-4b42-a4b3-d1e44b632972\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.451310 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff736888-83cd-4b42-a4b3-d1e44b632972-logs\") pod \"ff736888-83cd-4b42-a4b3-d1e44b632972\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.451340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data-custom\") pod \"ff736888-83cd-4b42-a4b3-d1e44b632972\" (UID: \"ff736888-83cd-4b42-a4b3-d1e44b632972\") " Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.451747 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ff736888-83cd-4b42-a4b3-d1e44b632972-logs" (OuterVolumeSpecName: "logs") pod "ff736888-83cd-4b42-a4b3-d1e44b632972" (UID: "ff736888-83cd-4b42-a4b3-d1e44b632972"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.455449 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ff736888-83cd-4b42-a4b3-d1e44b632972" (UID: "ff736888-83cd-4b42-a4b3-d1e44b632972"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.455612 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff736888-83cd-4b42-a4b3-d1e44b632972-kube-api-access-nwwdh" (OuterVolumeSpecName: "kube-api-access-nwwdh") pod "ff736888-83cd-4b42-a4b3-d1e44b632972" (UID: "ff736888-83cd-4b42-a4b3-d1e44b632972"). InnerVolumeSpecName "kube-api-access-nwwdh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.465422 4795 generic.go:334] "Generic (PLEG): container finished" podID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerID="d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702" exitCode=0 Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.465492 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d485ffcdb-tpksk" event={"ID":"ff736888-83cd-4b42-a4b3-d1e44b632972","Type":"ContainerDied","Data":"d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702"} Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.466530 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d485ffcdb-tpksk" event={"ID":"ff736888-83cd-4b42-a4b3-d1e44b632972","Type":"ContainerDied","Data":"960b4e14ac584a7a84d335bde14eff76da6780799d9a61af31805219a3925acd"} Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.466712 4795 scope.go:117] "RemoveContainer" containerID="d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.466844 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d485ffcdb-tpksk" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.505920 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eaa71521-8502-4baa-a81f-7c8147ffd6a5","Type":"ContainerStarted","Data":"0f97f5a04681f818b42dae56e051b695a1dd13930cf2cde454d6cf50415f4629"} Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.506239 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff736888-83cd-4b42-a4b3-d1e44b632972" (UID: "ff736888-83cd-4b42-a4b3-d1e44b632972"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.507577 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data" (OuterVolumeSpecName: "config-data") pod "ff736888-83cd-4b42-a4b3-d1e44b632972" (UID: "ff736888-83cd-4b42-a4b3-d1e44b632972"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.514668 4795 scope.go:117] "RemoveContainer" containerID="4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.518478 4795 generic.go:334] "Generic (PLEG): container finished" podID="382ef56c-07e8-4af1-87ee-09887564eabc" containerID="7b8f5325d3474a9157d36c929c0d92da4e5b6b211af56ce7d8b8ec9ba7aa95be" exitCode=0 Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.518527 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c97bfbc45-gfdc4" event={"ID":"382ef56c-07e8-4af1-87ee-09887564eabc","Type":"ContainerDied","Data":"7b8f5325d3474a9157d36c929c0d92da4e5b6b211af56ce7d8b8ec9ba7aa95be"} Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.553587 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff736888-83cd-4b42-a4b3-d1e44b632972-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.553622 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.553635 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwwdh\" (UniqueName: \"kubernetes.io/projected/ff736888-83cd-4b42-a4b3-d1e44b632972-kube-api-access-nwwdh\") on 
node \"crc\" DevicePath \"\"" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.553643 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.553654 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff736888-83cd-4b42-a4b3-d1e44b632972-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.568497 4795 scope.go:117] "RemoveContainer" containerID="d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702" Mar 10 15:27:10 crc kubenswrapper[4795]: E0310 15:27:10.568945 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702\": container with ID starting with d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702 not found: ID does not exist" containerID="d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.568974 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702"} err="failed to get container status \"d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702\": rpc error: code = NotFound desc = could not find container \"d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702\": container with ID starting with d739b21ed2ae7dec08bdd6978154fd89f347bc6bde03e8f2c528f3cf653c9702 not found: ID does not exist" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.569011 4795 scope.go:117] "RemoveContainer" containerID="4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75" Mar 10 15:27:10 crc 
kubenswrapper[4795]: E0310 15:27:10.570280 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75\": container with ID starting with 4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75 not found: ID does not exist" containerID="4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.570315 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75"} err="failed to get container status \"4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75\": rpc error: code = NotFound desc = could not find container \"4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75\": container with ID starting with 4517d7810c4a5c9c362dc25786667e450feaa88bb9156751c844750498a79c75 not found: ID does not exist" Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.598649 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.808417 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d485ffcdb-tpksk"] Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.817970 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d485ffcdb-tpksk"] Mar 10 15:27:10 crc kubenswrapper[4795]: I0310 15:27:10.861235 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-9f8b48dd7-fxv5t"] Mar 10 15:27:10 crc kubenswrapper[4795]: W0310 15:27:10.870269 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda43b4908_0c2c_4d7c_8aff_1cd405684654.slice/crio-a03513737bf34eef81e61aa76c2d0f158b645b2aa0bbfbf707abd7a9552f0069 WatchSource:0}: 
Error finding container a03513737bf34eef81e61aa76c2d0f158b645b2aa0bbfbf707abd7a9552f0069: Status 404 returned error can't find the container with id a03513737bf34eef81e61aa76c2d0f158b645b2aa0bbfbf707abd7a9552f0069 Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.487613 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79684c90-dc4f-4187-a086-c0777de981e3" path="/var/lib/kubelet/pods/79684c90-dc4f-4187-a086-c0777de981e3/volumes" Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.505375 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff736888-83cd-4b42-a4b3-d1e44b632972" path="/var/lib/kubelet/pods/ff736888-83cd-4b42-a4b3-d1e44b632972/volumes" Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.538943 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eaa71521-8502-4baa-a81f-7c8147ffd6a5","Type":"ContainerStarted","Data":"c29faf886523403b1d37fa0bbb0ff94dc02ef243bdaf59b4e217223d840bff50"} Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.540099 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.554650 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9f8b48dd7-fxv5t" event={"ID":"a43b4908-0c2c-4d7c-8aff-1cd405684654","Type":"ContainerStarted","Data":"22a282e760520ca2e520bb22b7b65aca5821a5ebf5486936622fc54959b06f5d"} Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.555154 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9f8b48dd7-fxv5t" event={"ID":"a43b4908-0c2c-4d7c-8aff-1cd405684654","Type":"ContainerStarted","Data":"669e99a9be21718737266ef183ed530fba2d2a1b6d769ae8dd54618b1719f189"} Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.555243 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-9f8b48dd7-fxv5t" 
event={"ID":"a43b4908-0c2c-4d7c-8aff-1cd405684654","Type":"ContainerStarted","Data":"a03513737bf34eef81e61aa76c2d0f158b645b2aa0bbfbf707abd7a9552f0069"} Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.556286 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.559836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ada9ed25-edcf-4a12-89ed-5a195477f439","Type":"ContainerStarted","Data":"ddcfcf311b3e75a7bed53df2080a34c754951c97da4d0aeb0dc6e5f01217dc9f"} Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.559880 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ada9ed25-edcf-4a12-89ed-5a195477f439","Type":"ContainerStarted","Data":"338549c44572df87a3e65d25b2e9682241ec2311467dff3d605894b28943ffd3"} Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.572830 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.572816308 podStartE2EDuration="3.572816308s" podCreationTimestamp="2026-03-10 15:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:11.57011044 +0000 UTC m=+1264.735851338" watchObservedRunningTime="2026-03-10 15:27:11.572816308 +0000 UTC m=+1264.738557196" Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.597914 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-9f8b48dd7-fxv5t" podStartSLOduration=2.597895705 podStartE2EDuration="2.597895705s" podCreationTimestamp="2026-03-10 15:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:11.595551618 +0000 UTC m=+1264.761292516" watchObservedRunningTime="2026-03-10 
15:27:11.597895705 +0000 UTC m=+1264.763636603" Mar 10 15:27:11 crc kubenswrapper[4795]: I0310 15:27:11.642184 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5c97bfbc45-gfdc4" podUID="382ef56c-07e8-4af1-87ee-09887564eabc" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Mar 10 15:27:12 crc kubenswrapper[4795]: I0310 15:27:12.587109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ada9ed25-edcf-4a12-89ed-5a195477f439","Type":"ContainerStarted","Data":"24b45801b3a62d365af23aed3af835c69aa98b3e0745f8992fd0bf396f21c584"} Mar 10 15:27:13 crc kubenswrapper[4795]: I0310 15:27:13.606936 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ada9ed25-edcf-4a12-89ed-5a195477f439","Type":"ContainerStarted","Data":"edcbcbaf212192eea93d3dbf886acd2b7e63ae57306352bb5381cc283e0965f7"} Mar 10 15:27:13 crc kubenswrapper[4795]: I0310 15:27:13.679874 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-67547556b6-45876" Mar 10 15:27:13 crc kubenswrapper[4795]: I0310 15:27:13.705944 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.037944 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.043339 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.096372 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.235349 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-848cf88cfc-mx8gf"] Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.235708 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" podUID="32364ecb-f9b9-4abc-8491-bdf2105d0848" containerName="dnsmasq-dns" containerID="cri-o://6d2ecbd119e1c6acab3a76364b2b12453e662b3d9712969f1b5b24f00ad5f26d" gracePeriod=10 Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.622121 4795 generic.go:334] "Generic (PLEG): container finished" podID="382ef56c-07e8-4af1-87ee-09887564eabc" containerID="ffd79a8c9c339f8985e86f020f3ac03f340c3dce531af0d8b24567498c3ce454" exitCode=0 Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.622214 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c97bfbc45-gfdc4" event={"ID":"382ef56c-07e8-4af1-87ee-09887564eabc","Type":"ContainerDied","Data":"ffd79a8c9c339f8985e86f020f3ac03f340c3dce531af0d8b24567498c3ce454"} Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.626974 4795 generic.go:334] "Generic (PLEG): container finished" podID="32364ecb-f9b9-4abc-8491-bdf2105d0848" containerID="6d2ecbd119e1c6acab3a76364b2b12453e662b3d9712969f1b5b24f00ad5f26d" exitCode=0 Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.627473 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="37370315-b9d3-47f9-b8a4-dc1fc884ee6b" containerName="cinder-scheduler" containerID="cri-o://17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67" gracePeriod=30 Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.627860 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="37370315-b9d3-47f9-b8a4-dc1fc884ee6b" containerName="probe" containerID="cri-o://4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59" gracePeriod=30 Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.627924 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" event={"ID":"32364ecb-f9b9-4abc-8491-bdf2105d0848","Type":"ContainerDied","Data":"6d2ecbd119e1c6acab3a76364b2b12453e662b3d9712969f1b5b24f00ad5f26d"} Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.898510 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c97bfbc45-gfdc4" Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.907374 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.944242 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-httpd-config\") pod \"382ef56c-07e8-4af1-87ee-09887564eabc\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.944352 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-combined-ca-bundle\") pod \"382ef56c-07e8-4af1-87ee-09887564eabc\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.944383 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-ovndb-tls-certs\") pod \"382ef56c-07e8-4af1-87ee-09887564eabc\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.944459 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-internal-tls-certs\") pod \"382ef56c-07e8-4af1-87ee-09887564eabc\" (UID: 
\"382ef56c-07e8-4af1-87ee-09887564eabc\") " Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.944538 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-config\") pod \"382ef56c-07e8-4af1-87ee-09887564eabc\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.944567 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s729g\" (UniqueName: \"kubernetes.io/projected/382ef56c-07e8-4af1-87ee-09887564eabc-kube-api-access-s729g\") pod \"382ef56c-07e8-4af1-87ee-09887564eabc\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.944606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-public-tls-certs\") pod \"382ef56c-07e8-4af1-87ee-09887564eabc\" (UID: \"382ef56c-07e8-4af1-87ee-09887564eabc\") " Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.966845 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/382ef56c-07e8-4af1-87ee-09887564eabc-kube-api-access-s729g" (OuterVolumeSpecName: "kube-api-access-s729g") pod "382ef56c-07e8-4af1-87ee-09887564eabc" (UID: "382ef56c-07e8-4af1-87ee-09887564eabc"). InnerVolumeSpecName "kube-api-access-s729g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:14 crc kubenswrapper[4795]: I0310 15:27:14.980751 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "382ef56c-07e8-4af1-87ee-09887564eabc" (UID: "382ef56c-07e8-4af1-87ee-09887564eabc"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.041250 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "382ef56c-07e8-4af1-87ee-09887564eabc" (UID: "382ef56c-07e8-4af1-87ee-09887564eabc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.048335 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-swift-storage-0\") pod \"32364ecb-f9b9-4abc-8491-bdf2105d0848\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.048391 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-nb\") pod \"32364ecb-f9b9-4abc-8491-bdf2105d0848\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.048428 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-config\") pod \"32364ecb-f9b9-4abc-8491-bdf2105d0848\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.048503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcrsv\" (UniqueName: \"kubernetes.io/projected/32364ecb-f9b9-4abc-8491-bdf2105d0848-kube-api-access-wcrsv\") pod \"32364ecb-f9b9-4abc-8491-bdf2105d0848\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.048534 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-sb\") pod \"32364ecb-f9b9-4abc-8491-bdf2105d0848\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.048618 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-svc\") pod \"32364ecb-f9b9-4abc-8491-bdf2105d0848\" (UID: \"32364ecb-f9b9-4abc-8491-bdf2105d0848\") " Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.049000 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.049016 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s729g\" (UniqueName: \"kubernetes.io/projected/382ef56c-07e8-4af1-87ee-09887564eabc-kube-api-access-s729g\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.049026 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.082060 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "382ef56c-07e8-4af1-87ee-09887564eabc" (UID: "382ef56c-07e8-4af1-87ee-09887564eabc"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.085466 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32364ecb-f9b9-4abc-8491-bdf2105d0848-kube-api-access-wcrsv" (OuterVolumeSpecName: "kube-api-access-wcrsv") pod "32364ecb-f9b9-4abc-8491-bdf2105d0848" (UID: "32364ecb-f9b9-4abc-8491-bdf2105d0848"). InnerVolumeSpecName "kube-api-access-wcrsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.094250 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-config" (OuterVolumeSpecName: "config") pod "382ef56c-07e8-4af1-87ee-09887564eabc" (UID: "382ef56c-07e8-4af1-87ee-09887564eabc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.130316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "382ef56c-07e8-4af1-87ee-09887564eabc" (UID: "382ef56c-07e8-4af1-87ee-09887564eabc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.142629 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "32364ecb-f9b9-4abc-8491-bdf2105d0848" (UID: "32364ecb-f9b9-4abc-8491-bdf2105d0848"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.149567 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "32364ecb-f9b9-4abc-8491-bdf2105d0848" (UID: "32364ecb-f9b9-4abc-8491-bdf2105d0848"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.150726 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.150808 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.150866 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.150935 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.150992 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.151053 4795 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-wcrsv\" (UniqueName: \"kubernetes.io/projected/32364ecb-f9b9-4abc-8491-bdf2105d0848-kube-api-access-wcrsv\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.157546 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-config" (OuterVolumeSpecName: "config") pod "32364ecb-f9b9-4abc-8491-bdf2105d0848" (UID: "32364ecb-f9b9-4abc-8491-bdf2105d0848"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.165575 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "32364ecb-f9b9-4abc-8491-bdf2105d0848" (UID: "32364ecb-f9b9-4abc-8491-bdf2105d0848"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.176186 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "382ef56c-07e8-4af1-87ee-09887564eabc" (UID: "382ef56c-07e8-4af1-87ee-09887564eabc"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.176529 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "32364ecb-f9b9-4abc-8491-bdf2105d0848" (UID: "32364ecb-f9b9-4abc-8491-bdf2105d0848"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.252381 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.252415 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.252451 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/32364ecb-f9b9-4abc-8491-bdf2105d0848-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.252460 4795 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/382ef56c-07e8-4af1-87ee-09887564eabc-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.524592 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5869d54dfb-2wjww" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.591168 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67547556b6-45876"] Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.591400 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67547556b6-45876" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon-log" containerID="cri-o://2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631" gracePeriod=30 Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.591523 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67547556b6-45876" 
podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon" containerID="cri-o://d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0" gracePeriod=30 Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.595669 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67547556b6-45876" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.657820 4795 generic.go:334] "Generic (PLEG): container finished" podID="37370315-b9d3-47f9-b8a4-dc1fc884ee6b" containerID="4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59" exitCode=0 Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.657875 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"37370315-b9d3-47f9-b8a4-dc1fc884ee6b","Type":"ContainerDied","Data":"4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59"} Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.664274 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5c97bfbc45-gfdc4" event={"ID":"382ef56c-07e8-4af1-87ee-09887564eabc","Type":"ContainerDied","Data":"89070222faae0f834836fee0f929b509c5216d58609663fc3db3d525722ddcdf"} Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.664361 4795 scope.go:117] "RemoveContainer" containerID="7b8f5325d3474a9157d36c929c0d92da4e5b6b211af56ce7d8b8ec9ba7aa95be" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.664305 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5c97bfbc45-gfdc4" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.667517 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.667629 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-mx8gf" event={"ID":"32364ecb-f9b9-4abc-8491-bdf2105d0848","Type":"ContainerDied","Data":"41a5a4a5b433e879c391b0b1265f875d11aa8da831128ca0ad45498baea8ce95"} Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.674726 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ada9ed25-edcf-4a12-89ed-5a195477f439","Type":"ContainerStarted","Data":"1bec50d4abda5db782460c5b037001cb98c30bcc2d8d2f9ec800984b3bcbdd11"} Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.674895 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.707102 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5c97bfbc45-gfdc4"] Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.709447 4795 scope.go:117] "RemoveContainer" containerID="ffd79a8c9c339f8985e86f020f3ac03f340c3dce531af0d8b24567498c3ce454" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.717931 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5c97bfbc45-gfdc4"] Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.727818 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-mx8gf"] Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.733291 4795 scope.go:117] "RemoveContainer" containerID="6d2ecbd119e1c6acab3a76364b2b12453e662b3d9712969f1b5b24f00ad5f26d" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.735232 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-mx8gf"] Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.739470 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.632542344 podStartE2EDuration="6.739453972s" podCreationTimestamp="2026-03-10 15:27:09 +0000 UTC" firstStartedPulling="2026-03-10 15:27:10.612852039 +0000 UTC m=+1263.778592947" lastFinishedPulling="2026-03-10 15:27:14.719763677 +0000 UTC m=+1267.885504575" observedRunningTime="2026-03-10 15:27:15.727057578 +0000 UTC m=+1268.892798476" watchObservedRunningTime="2026-03-10 15:27:15.739453972 +0000 UTC m=+1268.905194870" Mar 10 15:27:15 crc kubenswrapper[4795]: I0310 15:27:15.756227 4795 scope.go:117] "RemoveContainer" containerID="749a02bca4d003e97979b1fbacfc24d38895e7be51630c1662242f3992623a15" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.398982 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7946f6d44-rgccw" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.438924 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-67547556b6-45876" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.458996 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.526476 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32364ecb-f9b9-4abc-8491-bdf2105d0848" path="/var/lib/kubelet/pods/32364ecb-f9b9-4abc-8491-bdf2105d0848/volumes" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.529363 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="382ef56c-07e8-4af1-87ee-09887564eabc" path="/var/lib/kubelet/pods/382ef56c-07e8-4af1-87ee-09887564eabc/volumes" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.548488 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7946f6d44-rgccw" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.610309 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-combined-ca-bundle\") pod \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.610458 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-etc-machine-id\") pod \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.610486 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data\") pod \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.610545 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-scripts\") pod \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.610625 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data-custom\") pod \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.610646 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkm2n\" (UniqueName: \"kubernetes.io/projected/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-kube-api-access-rkm2n\") pod \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\" (UID: \"37370315-b9d3-47f9-b8a4-dc1fc884ee6b\") " Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.613144 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "37370315-b9d3-47f9-b8a4-dc1fc884ee6b" (UID: "37370315-b9d3-47f9-b8a4-dc1fc884ee6b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.619047 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-scripts" (OuterVolumeSpecName: "scripts") pod "37370315-b9d3-47f9-b8a4-dc1fc884ee6b" (UID: "37370315-b9d3-47f9-b8a4-dc1fc884ee6b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.619542 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-kube-api-access-rkm2n" (OuterVolumeSpecName: "kube-api-access-rkm2n") pod "37370315-b9d3-47f9-b8a4-dc1fc884ee6b" (UID: "37370315-b9d3-47f9-b8a4-dc1fc884ee6b"). InnerVolumeSpecName "kube-api-access-rkm2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.620394 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "37370315-b9d3-47f9-b8a4-dc1fc884ee6b" (UID: "37370315-b9d3-47f9-b8a4-dc1fc884ee6b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.676647 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37370315-b9d3-47f9-b8a4-dc1fc884ee6b" (UID: "37370315-b9d3-47f9-b8a4-dc1fc884ee6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.700524 4795 generic.go:334] "Generic (PLEG): container finished" podID="37370315-b9d3-47f9-b8a4-dc1fc884ee6b" containerID="17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67" exitCode=0 Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.701329 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.701838 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"37370315-b9d3-47f9-b8a4-dc1fc884ee6b","Type":"ContainerDied","Data":"17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67"} Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.701871 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"37370315-b9d3-47f9-b8a4-dc1fc884ee6b","Type":"ContainerDied","Data":"273917a54acbc4f62bcf52ccab2723bd27949053f6e9589caa8bb19aee58c58f"} Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.701892 4795 scope.go:117] "RemoveContainer" containerID="4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.713902 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data" (OuterVolumeSpecName: "config-data") pod "37370315-b9d3-47f9-b8a4-dc1fc884ee6b" (UID: "37370315-b9d3-47f9-b8a4-dc1fc884ee6b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.715405 4795 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.715516 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkm2n\" (UniqueName: \"kubernetes.io/projected/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-kube-api-access-rkm2n\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.715588 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.715653 4795 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.715721 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.715804 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/37370315-b9d3-47f9-b8a4-dc1fc884ee6b-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.726559 4795 scope.go:117] "RemoveContainer" containerID="17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.754487 4795 scope.go:117] "RemoveContainer" 
containerID="4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59" Mar 10 15:27:17 crc kubenswrapper[4795]: E0310 15:27:17.755109 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59\": container with ID starting with 4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59 not found: ID does not exist" containerID="4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.755242 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59"} err="failed to get container status \"4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59\": rpc error: code = NotFound desc = could not find container \"4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59\": container with ID starting with 4bd38cda67e05cd1c34008669b0b0b8b7ff77e048d7cce375063de4de78c0c59 not found: ID does not exist" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.755347 4795 scope.go:117] "RemoveContainer" containerID="17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67" Mar 10 15:27:17 crc kubenswrapper[4795]: E0310 15:27:17.755810 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67\": container with ID starting with 17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67 not found: ID does not exist" containerID="17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.755933 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67"} err="failed to get container status \"17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67\": rpc error: code = NotFound desc = could not find container \"17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67\": container with ID starting with 17f7be9ed96a9da63e14006f17cc90cda05e6e73ae6c1024fa8abc26e7b6aa67 not found: ID does not exist" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.794894 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-56fb779756-g2577"] Mar 10 15:27:17 crc kubenswrapper[4795]: E0310 15:27:17.795260 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382ef56c-07e8-4af1-87ee-09887564eabc" containerName="neutron-api" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795276 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="382ef56c-07e8-4af1-87ee-09887564eabc" containerName="neutron-api" Mar 10 15:27:17 crc kubenswrapper[4795]: E0310 15:27:17.795291 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32364ecb-f9b9-4abc-8491-bdf2105d0848" containerName="dnsmasq-dns" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795297 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32364ecb-f9b9-4abc-8491-bdf2105d0848" containerName="dnsmasq-dns" Mar 10 15:27:17 crc kubenswrapper[4795]: E0310 15:27:17.795309 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382ef56c-07e8-4af1-87ee-09887564eabc" containerName="neutron-httpd" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795315 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="382ef56c-07e8-4af1-87ee-09887564eabc" containerName="neutron-httpd" Mar 10 15:27:17 crc kubenswrapper[4795]: E0310 15:27:17.795328 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32364ecb-f9b9-4abc-8491-bdf2105d0848" containerName="init" Mar 10 
15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795334 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="32364ecb-f9b9-4abc-8491-bdf2105d0848" containerName="init" Mar 10 15:27:17 crc kubenswrapper[4795]: E0310 15:27:17.795344 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37370315-b9d3-47f9-b8a4-dc1fc884ee6b" containerName="probe" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795351 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="37370315-b9d3-47f9-b8a4-dc1fc884ee6b" containerName="probe" Mar 10 15:27:17 crc kubenswrapper[4795]: E0310 15:27:17.795366 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerName="barbican-api" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795371 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerName="barbican-api" Mar 10 15:27:17 crc kubenswrapper[4795]: E0310 15:27:17.795381 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37370315-b9d3-47f9-b8a4-dc1fc884ee6b" containerName="cinder-scheduler" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795387 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="37370315-b9d3-47f9-b8a4-dc1fc884ee6b" containerName="cinder-scheduler" Mar 10 15:27:17 crc kubenswrapper[4795]: E0310 15:27:17.795396 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerName="barbican-api-log" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795402 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerName="barbican-api-log" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795552 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerName="barbican-api" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 
15:27:17.795565 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="382ef56c-07e8-4af1-87ee-09887564eabc" containerName="neutron-httpd" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795581 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="37370315-b9d3-47f9-b8a4-dc1fc884ee6b" containerName="probe" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795592 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff736888-83cd-4b42-a4b3-d1e44b632972" containerName="barbican-api-log" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795600 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="382ef56c-07e8-4af1-87ee-09887564eabc" containerName="neutron-api" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795607 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="32364ecb-f9b9-4abc-8491-bdf2105d0848" containerName="dnsmasq-dns" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.795618 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="37370315-b9d3-47f9-b8a4-dc1fc884ee6b" containerName="cinder-scheduler" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.796966 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.804542 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56fb779756-g2577"] Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.923953 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-internal-tls-certs\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.924080 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvdnw\" (UniqueName: \"kubernetes.io/projected/99b89308-d3a5-4f4d-ae50-92ae07fb6941-kube-api-access-pvdnw\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.924112 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b89308-d3a5-4f4d-ae50-92ae07fb6941-logs\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.924203 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-scripts\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.924258 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-config-data\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.924300 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-combined-ca-bundle\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:17 crc kubenswrapper[4795]: I0310 15:27:17.924321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-public-tls-certs\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.025756 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-public-tls-certs\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.025875 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-internal-tls-certs\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.025947 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvdnw\" 
(UniqueName: \"kubernetes.io/projected/99b89308-d3a5-4f4d-ae50-92ae07fb6941-kube-api-access-pvdnw\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.025978 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b89308-d3a5-4f4d-ae50-92ae07fb6941-logs\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.026032 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-scripts\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.026108 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-config-data\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.026154 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-combined-ca-bundle\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.029945 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/99b89308-d3a5-4f4d-ae50-92ae07fb6941-logs\") pod \"placement-56fb779756-g2577\" (UID: 
\"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.035619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-internal-tls-certs\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.035898 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-config-data\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.039469 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-scripts\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.040697 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-public-tls-certs\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.045126 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.046892 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/99b89308-d3a5-4f4d-ae50-92ae07fb6941-combined-ca-bundle\") pod 
\"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.052347 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvdnw\" (UniqueName: \"kubernetes.io/projected/99b89308-d3a5-4f4d-ae50-92ae07fb6941-kube-api-access-pvdnw\") pod \"placement-56fb779756-g2577\" (UID: \"99b89308-d3a5-4f4d-ae50-92ae07fb6941\") " pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.061722 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.078037 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.082297 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.085032 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.099981 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.111674 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.229464 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd6c655a-bae7-4234-9d8a-b585e74a75e6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.229817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-config-data\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.229901 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gjsf\" (UniqueName: \"kubernetes.io/projected/fd6c655a-bae7-4234-9d8a-b585e74a75e6-kube-api-access-8gjsf\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.229962 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.230007 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-scripts\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " 
pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.230092 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.331269 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.331331 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd6c655a-bae7-4234-9d8a-b585e74a75e6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.331348 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-config-data\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.331410 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gjsf\" (UniqueName: \"kubernetes.io/projected/fd6c655a-bae7-4234-9d8a-b585e74a75e6-kube-api-access-8gjsf\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.331451 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.331499 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-scripts\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.333459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd6c655a-bae7-4234-9d8a-b585e74a75e6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.337901 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-scripts\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.338597 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.361977 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.363447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd6c655a-bae7-4234-9d8a-b585e74a75e6-config-data\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.365617 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gjsf\" (UniqueName: \"kubernetes.io/projected/fd6c655a-bae7-4234-9d8a-b585e74a75e6-kube-api-access-8gjsf\") pod \"cinder-scheduler-0\" (UID: \"fd6c655a-bae7-4234-9d8a-b585e74a75e6\") " pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.514658 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.565517 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56fb779756-g2577"] Mar 10 15:27:18 crc kubenswrapper[4795]: W0310 15:27:18.568375 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99b89308_d3a5_4f4d_ae50_92ae07fb6941.slice/crio-41957570699a460a75cce13df9bb21ee686401c59c9a80cbba1ee9daefc4ce9f WatchSource:0}: Error finding container 41957570699a460a75cce13df9bb21ee686401c59c9a80cbba1ee9daefc4ce9f: Status 404 returned error can't find the container with id 41957570699a460a75cce13df9bb21ee686401c59c9a80cbba1ee9daefc4ce9f Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.733625 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56fb779756-g2577" 
event={"ID":"99b89308-d3a5-4f4d-ae50-92ae07fb6941","Type":"ContainerStarted","Data":"41957570699a460a75cce13df9bb21ee686401c59c9a80cbba1ee9daefc4ce9f"} Mar 10 15:27:18 crc kubenswrapper[4795]: I0310 15:27:18.927297 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 10 15:27:18 crc kubenswrapper[4795]: W0310 15:27:18.936679 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd6c655a_bae7_4234_9d8a_b585e74a75e6.slice/crio-081fc21cb785bc589d2b50fa46774963beaa107e537b452e571fe44f2b1241aa WatchSource:0}: Error finding container 081fc21cb785bc589d2b50fa46774963beaa107e537b452e571fe44f2b1241aa: Status 404 returned error can't find the container with id 081fc21cb785bc589d2b50fa46774963beaa107e537b452e571fe44f2b1241aa Mar 10 15:27:19 crc kubenswrapper[4795]: I0310 15:27:19.489062 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37370315-b9d3-47f9-b8a4-dc1fc884ee6b" path="/var/lib/kubelet/pods/37370315-b9d3-47f9-b8a4-dc1fc884ee6b/volumes" Mar 10 15:27:19 crc kubenswrapper[4795]: I0310 15:27:19.747176 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56fb779756-g2577" event={"ID":"99b89308-d3a5-4f4d-ae50-92ae07fb6941","Type":"ContainerStarted","Data":"a09ad760187830a4e914521e35aa2e80460b12b9770a01f8f1a1c87853af5a83"} Mar 10 15:27:19 crc kubenswrapper[4795]: I0310 15:27:19.747251 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56fb779756-g2577" event={"ID":"99b89308-d3a5-4f4d-ae50-92ae07fb6941","Type":"ContainerStarted","Data":"f9bb3acd4a7af8e35d9ed03a2922fe26893f8351289ac0a7b29bb2da4a460dee"} Mar 10 15:27:19 crc kubenswrapper[4795]: I0310 15:27:19.748147 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:19 crc kubenswrapper[4795]: I0310 15:27:19.748197 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:19 crc kubenswrapper[4795]: I0310 15:27:19.752855 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fd6c655a-bae7-4234-9d8a-b585e74a75e6","Type":"ContainerStarted","Data":"6cd0021a7e3c7afeb63f5977566e8722d94a967ecb1265585ce98a660d708ed5"} Mar 10 15:27:19 crc kubenswrapper[4795]: I0310 15:27:19.752898 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fd6c655a-bae7-4234-9d8a-b585e74a75e6","Type":"ContainerStarted","Data":"081fc21cb785bc589d2b50fa46774963beaa107e537b452e571fe44f2b1241aa"} Mar 10 15:27:19 crc kubenswrapper[4795]: I0310 15:27:19.779145 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-56fb779756-g2577" podStartSLOduration=2.779122487 podStartE2EDuration="2.779122487s" podCreationTimestamp="2026-03-10 15:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:19.770184061 +0000 UTC m=+1272.935924959" watchObservedRunningTime="2026-03-10 15:27:19.779122487 +0000 UTC m=+1272.944863395" Mar 10 15:27:20 crc kubenswrapper[4795]: I0310 15:27:20.768293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fd6c655a-bae7-4234-9d8a-b585e74a75e6","Type":"ContainerStarted","Data":"84d907b46459e54888776b91a656f655e5fe485e41e33a78eb896c6f32b6ee03"} Mar 10 15:27:20 crc kubenswrapper[4795]: I0310 15:27:20.794698 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.7946662140000003 podStartE2EDuration="2.794666214s" podCreationTimestamp="2026-03-10 15:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 
15:27:20.785846802 +0000 UTC m=+1273.951587750" watchObservedRunningTime="2026-03-10 15:27:20.794666214 +0000 UTC m=+1273.960407142" Mar 10 15:27:20 crc kubenswrapper[4795]: I0310 15:27:20.971005 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 10 15:27:21 crc kubenswrapper[4795]: I0310 15:27:21.471175 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67547556b6-45876" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 10 15:27:21 crc kubenswrapper[4795]: I0310 15:27:21.782131 4795 generic.go:334] "Generic (PLEG): container finished" podID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerID="d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0" exitCode=0 Mar 10 15:27:21 crc kubenswrapper[4795]: I0310 15:27:21.782216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67547556b6-45876" event={"ID":"9c0c78d2-7838-4836-975b-87312ba1c49e","Type":"ContainerDied","Data":"d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0"} Mar 10 15:27:22 crc kubenswrapper[4795]: I0310 15:27:22.627467 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5b46885bb9-hzjqn" Mar 10 15:27:23 crc kubenswrapper[4795]: I0310 15:27:23.515656 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.154432 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.157032 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.159408 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-mq84r" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.159910 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.163767 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.173455 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.173497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw7vj\" (UniqueName: \"kubernetes.io/projected/c91ac2bd-a479-469e-8c55-c0df64a4faec-kube-api-access-dw7vj\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.173575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.173630 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config-secret\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.177167 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.275224 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config-secret\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.275298 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.275323 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7vj\" (UniqueName: \"kubernetes.io/projected/c91ac2bd-a479-469e-8c55-c0df64a4faec-kube-api-access-dw7vj\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.275388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.276206 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.289495 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.292375 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7vj\" (UniqueName: \"kubernetes.io/projected/c91ac2bd-a479-469e-8c55-c0df64a4faec-kube-api-access-dw7vj\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.295414 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config-secret\") pod \"openstackclient\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.461780 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.462427 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.518863 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.518903 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.519840 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.519915 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.580174 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/64902640-6d88-46eb-98f0-475f8f976aaa-openstack-config\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.580248 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/64902640-6d88-46eb-98f0-475f8f976aaa-openstack-config-secret\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.580278 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64902640-6d88-46eb-98f0-475f8f976aaa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.580315 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-wjpfh\" (UniqueName: \"kubernetes.io/projected/64902640-6d88-46eb-98f0-475f8f976aaa-kube-api-access-wjpfh\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: E0310 15:27:25.597154 4795 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 10 15:27:25 crc kubenswrapper[4795]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_c91ac2bd-a479-469e-8c55-c0df64a4faec_0(58b78f417bb80234f12e647741a03708ed3a272375937ba72f50235b187fbc66): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"58b78f417bb80234f12e647741a03708ed3a272375937ba72f50235b187fbc66" Netns:"/var/run/netns/4078dbbc-1c94-48f9-87bf-8a88cc530525" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=58b78f417bb80234f12e647741a03708ed3a272375937ba72f50235b187fbc66;K8S_POD_UID=c91ac2bd-a479-469e-8c55-c0df64a4faec" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/c91ac2bd-a479-469e-8c55-c0df64a4faec]: expected pod UID "c91ac2bd-a479-469e-8c55-c0df64a4faec" but got "64902640-6d88-46eb-98f0-475f8f976aaa" from Kube API Mar 10 15:27:25 crc kubenswrapper[4795]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:27:25 crc kubenswrapper[4795]: > Mar 10 15:27:25 crc kubenswrapper[4795]: E0310 15:27:25.597227 4795 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 10 15:27:25 crc kubenswrapper[4795]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_c91ac2bd-a479-469e-8c55-c0df64a4faec_0(58b78f417bb80234f12e647741a03708ed3a272375937ba72f50235b187fbc66): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"58b78f417bb80234f12e647741a03708ed3a272375937ba72f50235b187fbc66" Netns:"/var/run/netns/4078dbbc-1c94-48f9-87bf-8a88cc530525" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=58b78f417bb80234f12e647741a03708ed3a272375937ba72f50235b187fbc66;K8S_POD_UID=c91ac2bd-a479-469e-8c55-c0df64a4faec" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/c91ac2bd-a479-469e-8c55-c0df64a4faec]: expected pod UID "c91ac2bd-a479-469e-8c55-c0df64a4faec" but got "64902640-6d88-46eb-98f0-475f8f976aaa" from Kube API Mar 10 15:27:25 crc kubenswrapper[4795]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 10 15:27:25 crc kubenswrapper[4795]: > pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.681888 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjpfh\" (UniqueName: \"kubernetes.io/projected/64902640-6d88-46eb-98f0-475f8f976aaa-kube-api-access-wjpfh\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " 
pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.682037 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/64902640-6d88-46eb-98f0-475f8f976aaa-openstack-config\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.682078 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/64902640-6d88-46eb-98f0-475f8f976aaa-openstack-config-secret\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.682109 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64902640-6d88-46eb-98f0-475f8f976aaa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.683020 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/64902640-6d88-46eb-98f0-475f8f976aaa-openstack-config\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.687592 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/64902640-6d88-46eb-98f0-475f8f976aaa-openstack-config-secret\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.696022 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64902640-6d88-46eb-98f0-475f8f976aaa-combined-ca-bundle\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.698182 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjpfh\" (UniqueName: \"kubernetes.io/projected/64902640-6d88-46eb-98f0-475f8f976aaa-kube-api-access-wjpfh\") pod \"openstackclient\" (UID: \"64902640-6d88-46eb-98f0-475f8f976aaa\") " pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.853632 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.857509 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c91ac2bd-a479-469e-8c55-c0df64a4faec" podUID="64902640-6d88-46eb-98f0-475f8f976aaa" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.861747 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.885463 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config\") pod \"c91ac2bd-a479-469e-8c55-c0df64a4faec\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.885735 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config-secret\") pod \"c91ac2bd-a479-469e-8c55-c0df64a4faec\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.885919 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw7vj\" (UniqueName: \"kubernetes.io/projected/c91ac2bd-a479-469e-8c55-c0df64a4faec-kube-api-access-dw7vj\") pod \"c91ac2bd-a479-469e-8c55-c0df64a4faec\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.885952 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c91ac2bd-a479-469e-8c55-c0df64a4faec" (UID: "c91ac2bd-a479-469e-8c55-c0df64a4faec"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.886061 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-combined-ca-bundle\") pod \"c91ac2bd-a479-469e-8c55-c0df64a4faec\" (UID: \"c91ac2bd-a479-469e-8c55-c0df64a4faec\") " Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.886825 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.889479 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c91ac2bd-a479-469e-8c55-c0df64a4faec" (UID: "c91ac2bd-a479-469e-8c55-c0df64a4faec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.891144 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91ac2bd-a479-469e-8c55-c0df64a4faec-kube-api-access-dw7vj" (OuterVolumeSpecName: "kube-api-access-dw7vj") pod "c91ac2bd-a479-469e-8c55-c0df64a4faec" (UID: "c91ac2bd-a479-469e-8c55-c0df64a4faec"). InnerVolumeSpecName "kube-api-access-dw7vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.891163 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c91ac2bd-a479-469e-8c55-c0df64a4faec" (UID: "c91ac2bd-a479-469e-8c55-c0df64a4faec"). 
InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.961036 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.988741 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.988784 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw7vj\" (UniqueName: \"kubernetes.io/projected/c91ac2bd-a479-469e-8c55-c0df64a4faec-kube-api-access-dw7vj\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:25 crc kubenswrapper[4795]: I0310 15:27:25.988799 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c91ac2bd-a479-469e-8c55-c0df64a4faec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:26 crc kubenswrapper[4795]: I0310 15:27:26.430621 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 10 15:27:26 crc kubenswrapper[4795]: I0310 15:27:26.866383 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 10 15:27:26 crc kubenswrapper[4795]: I0310 15:27:26.868430 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"64902640-6d88-46eb-98f0-475f8f976aaa","Type":"ContainerStarted","Data":"9558a2703497dc6c292ca70376737dfcaefdc1c514a582fcc2567fd60c015aef"} Mar 10 15:27:26 crc kubenswrapper[4795]: I0310 15:27:26.871350 4795 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c91ac2bd-a479-469e-8c55-c0df64a4faec" podUID="64902640-6d88-46eb-98f0-475f8f976aaa" Mar 10 15:27:27 crc kubenswrapper[4795]: I0310 15:27:27.508010 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91ac2bd-a479-469e-8c55-c0df64a4faec" path="/var/lib/kubelet/pods/c91ac2bd-a479-469e-8c55-c0df64a4faec/volumes" Mar 10 15:27:28 crc kubenswrapper[4795]: I0310 15:27:28.710475 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.452463 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.452745 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="ceilometer-central-agent" containerID="cri-o://ddcfcf311b3e75a7bed53df2080a34c754951c97da4d0aeb0dc6e5f01217dc9f" gracePeriod=30 Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.453457 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="proxy-httpd" containerID="cri-o://1bec50d4abda5db782460c5b037001cb98c30bcc2d8d2f9ec800984b3bcbdd11" gracePeriod=30 Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.453508 4795 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="sg-core" containerID="cri-o://edcbcbaf212192eea93d3dbf886acd2b7e63ae57306352bb5381cc283e0965f7" gracePeriod=30 Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.453541 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="ceilometer-notification-agent" containerID="cri-o://24b45801b3a62d365af23aed3af835c69aa98b3e0745f8992fd0bf396f21c584" gracePeriod=30 Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.462755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.668639 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6b868dfb95-x8b6r"] Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.681335 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.686214 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.686665 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.689000 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.730197 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6b868dfb95-x8b6r"] Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.776291 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-log-httpd\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.776349 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7whw\" (UniqueName: \"kubernetes.io/projected/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-kube-api-access-n7whw\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.776383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-internal-tls-certs\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc 
kubenswrapper[4795]: I0310 15:27:29.776504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-public-tls-certs\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.776595 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-etc-swift\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.776657 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-run-httpd\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.776749 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-config-data\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.776899 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-combined-ca-bundle\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" 
Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.879019 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-combined-ca-bundle\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.879131 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-log-httpd\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.879196 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7whw\" (UniqueName: \"kubernetes.io/projected/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-kube-api-access-n7whw\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.879238 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-internal-tls-certs\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.879263 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-public-tls-certs\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc 
kubenswrapper[4795]: I0310 15:27:29.879291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-etc-swift\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.879325 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-run-httpd\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.879370 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-config-data\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.879912 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-log-httpd\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.885134 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-run-httpd\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.885726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-combined-ca-bundle\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.886368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-config-data\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.886888 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-internal-tls-certs\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.887854 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-etc-swift\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.893578 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-public-tls-certs\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.901658 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7whw\" (UniqueName: 
\"kubernetes.io/projected/62ae38d2-b7a6-4c50-8506-dc3c18a89fd1-kube-api-access-n7whw\") pod \"swift-proxy-6b868dfb95-x8b6r\" (UID: \"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1\") " pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.903732 4795 generic.go:334] "Generic (PLEG): container finished" podID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerID="1bec50d4abda5db782460c5b037001cb98c30bcc2d8d2f9ec800984b3bcbdd11" exitCode=0 Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.903769 4795 generic.go:334] "Generic (PLEG): container finished" podID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerID="edcbcbaf212192eea93d3dbf886acd2b7e63ae57306352bb5381cc283e0965f7" exitCode=2 Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.903779 4795 generic.go:334] "Generic (PLEG): container finished" podID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerID="24b45801b3a62d365af23aed3af835c69aa98b3e0745f8992fd0bf396f21c584" exitCode=0 Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.903788 4795 generic.go:334] "Generic (PLEG): container finished" podID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerID="ddcfcf311b3e75a7bed53df2080a34c754951c97da4d0aeb0dc6e5f01217dc9f" exitCode=0 Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.903809 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ada9ed25-edcf-4a12-89ed-5a195477f439","Type":"ContainerDied","Data":"1bec50d4abda5db782460c5b037001cb98c30bcc2d8d2f9ec800984b3bcbdd11"} Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.903837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ada9ed25-edcf-4a12-89ed-5a195477f439","Type":"ContainerDied","Data":"edcbcbaf212192eea93d3dbf886acd2b7e63ae57306352bb5381cc283e0965f7"} Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.903849 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ada9ed25-edcf-4a12-89ed-5a195477f439","Type":"ContainerDied","Data":"24b45801b3a62d365af23aed3af835c69aa98b3e0745f8992fd0bf396f21c584"} Mar 10 15:27:29 crc kubenswrapper[4795]: I0310 15:27:29.903860 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ada9ed25-edcf-4a12-89ed-5a195477f439","Type":"ContainerDied","Data":"ddcfcf311b3e75a7bed53df2080a34c754951c97da4d0aeb0dc6e5f01217dc9f"} Mar 10 15:27:30 crc kubenswrapper[4795]: I0310 15:27:30.048825 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:31 crc kubenswrapper[4795]: I0310 15:27:31.471282 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67547556b6-45876" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.711549 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-7t4xc"] Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.713120 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-7t4xc" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.745337 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7t4xc"] Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.790818 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4dce0f-e1c8-434c-b513-0f6f86d89099-operator-scripts\") pod \"nova-api-db-create-7t4xc\" (UID: \"2a4dce0f-e1c8-434c-b513-0f6f86d89099\") " pod="openstack/nova-api-db-create-7t4xc" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.790902 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lld92\" (UniqueName: \"kubernetes.io/projected/2a4dce0f-e1c8-434c-b513-0f6f86d89099-kube-api-access-lld92\") pod \"nova-api-db-create-7t4xc\" (UID: \"2a4dce0f-e1c8-434c-b513-0f6f86d89099\") " pod="openstack/nova-api-db-create-7t4xc" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.859344 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-nmqx6"] Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.860610 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-nmqx6" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.875871 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nmqx6"] Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.895364 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4dce0f-e1c8-434c-b513-0f6f86d89099-operator-scripts\") pod \"nova-api-db-create-7t4xc\" (UID: \"2a4dce0f-e1c8-434c-b513-0f6f86d89099\") " pod="openstack/nova-api-db-create-7t4xc" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.896828 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4dce0f-e1c8-434c-b513-0f6f86d89099-operator-scripts\") pod \"nova-api-db-create-7t4xc\" (UID: \"2a4dce0f-e1c8-434c-b513-0f6f86d89099\") " pod="openstack/nova-api-db-create-7t4xc" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.897993 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lld92\" (UniqueName: \"kubernetes.io/projected/2a4dce0f-e1c8-434c-b513-0f6f86d89099-kube-api-access-lld92\") pod \"nova-api-db-create-7t4xc\" (UID: \"2a4dce0f-e1c8-434c-b513-0f6f86d89099\") " pod="openstack/nova-api-db-create-7t4xc" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.922768 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lld92\" (UniqueName: \"kubernetes.io/projected/2a4dce0f-e1c8-434c-b513-0f6f86d89099-kube-api-access-lld92\") pod \"nova-api-db-create-7t4xc\" (UID: \"2a4dce0f-e1c8-434c-b513-0f6f86d89099\") " pod="openstack/nova-api-db-create-7t4xc" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.959472 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ada9ed25-edcf-4a12-89ed-5a195477f439","Type":"ContainerDied","Data":"338549c44572df87a3e65d25b2e9682241ec2311467dff3d605894b28943ffd3"} Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.959509 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="338549c44572df87a3e65d25b2e9682241ec2311467dff3d605894b28943ffd3" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.967157 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-e8a3-account-create-update-lzpjf"] Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.968378 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e8a3-account-create-update-lzpjf" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.969764 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e8a3-account-create-update-lzpjf"] Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.974407 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.999600 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtjk8\" (UniqueName: \"kubernetes.io/projected/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-kube-api-access-xtjk8\") pod \"nova-cell0-db-create-nmqx6\" (UID: \"1de929ea-a6bc-48b9-9254-f0eaa6a73f36\") " pod="openstack/nova-cell0-db-create-nmqx6" Mar 10 15:27:34 crc kubenswrapper[4795]: I0310 15:27:34.999946 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5678c7de-b0fe-4d07-a752-1cd7eea46db6-operator-scripts\") pod \"nova-api-e8a3-account-create-update-lzpjf\" (UID: \"5678c7de-b0fe-4d07-a752-1cd7eea46db6\") " pod="openstack/nova-api-e8a3-account-create-update-lzpjf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.000037 
4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqjls\" (UniqueName: \"kubernetes.io/projected/5678c7de-b0fe-4d07-a752-1cd7eea46db6-kube-api-access-cqjls\") pod \"nova-api-e8a3-account-create-update-lzpjf\" (UID: \"5678c7de-b0fe-4d07-a752-1cd7eea46db6\") " pod="openstack/nova-api-e8a3-account-create-update-lzpjf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.000133 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-operator-scripts\") pod \"nova-cell0-db-create-nmqx6\" (UID: \"1de929ea-a6bc-48b9-9254-f0eaa6a73f36\") " pod="openstack/nova-cell0-db-create-nmqx6" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.013110 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-9gzkw"] Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.014392 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9gzkw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.037825 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7t4xc" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.047770 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.053129 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9gzkw"] Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.104310 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqjls\" (UniqueName: \"kubernetes.io/projected/5678c7de-b0fe-4d07-a752-1cd7eea46db6-kube-api-access-cqjls\") pod \"nova-api-e8a3-account-create-update-lzpjf\" (UID: \"5678c7de-b0fe-4d07-a752-1cd7eea46db6\") " pod="openstack/nova-api-e8a3-account-create-update-lzpjf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.104409 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-operator-scripts\") pod \"nova-cell0-db-create-nmqx6\" (UID: \"1de929ea-a6bc-48b9-9254-f0eaa6a73f36\") " pod="openstack/nova-cell0-db-create-nmqx6" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.104590 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtjk8\" (UniqueName: \"kubernetes.io/projected/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-kube-api-access-xtjk8\") pod \"nova-cell0-db-create-nmqx6\" (UID: \"1de929ea-a6bc-48b9-9254-f0eaa6a73f36\") " pod="openstack/nova-cell0-db-create-nmqx6" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.104635 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5678c7de-b0fe-4d07-a752-1cd7eea46db6-operator-scripts\") pod \"nova-api-e8a3-account-create-update-lzpjf\" (UID: \"5678c7de-b0fe-4d07-a752-1cd7eea46db6\") " pod="openstack/nova-api-e8a3-account-create-update-lzpjf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.105527 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5678c7de-b0fe-4d07-a752-1cd7eea46db6-operator-scripts\") pod \"nova-api-e8a3-account-create-update-lzpjf\" (UID: \"5678c7de-b0fe-4d07-a752-1cd7eea46db6\") " pod="openstack/nova-api-e8a3-account-create-update-lzpjf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.107409 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-operator-scripts\") pod \"nova-cell0-db-create-nmqx6\" (UID: \"1de929ea-a6bc-48b9-9254-f0eaa6a73f36\") " pod="openstack/nova-cell0-db-create-nmqx6" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.143992 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqjls\" (UniqueName: \"kubernetes.io/projected/5678c7de-b0fe-4d07-a752-1cd7eea46db6-kube-api-access-cqjls\") pod \"nova-api-e8a3-account-create-update-lzpjf\" (UID: \"5678c7de-b0fe-4d07-a752-1cd7eea46db6\") " pod="openstack/nova-api-e8a3-account-create-update-lzpjf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.145124 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtjk8\" (UniqueName: \"kubernetes.io/projected/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-kube-api-access-xtjk8\") pod \"nova-cell0-db-create-nmqx6\" (UID: \"1de929ea-a6bc-48b9-9254-f0eaa6a73f36\") " pod="openstack/nova-cell0-db-create-nmqx6" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.153945 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ede0-account-create-update-s67gw"] Mar 10 15:27:35 crc kubenswrapper[4795]: E0310 15:27:35.154379 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="ceilometer-central-agent" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.154397 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="ceilometer-central-agent" Mar 10 15:27:35 crc kubenswrapper[4795]: E0310 15:27:35.154405 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="sg-core" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.154415 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="sg-core" Mar 10 15:27:35 crc kubenswrapper[4795]: E0310 15:27:35.154449 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="proxy-httpd" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.154455 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="proxy-httpd" Mar 10 15:27:35 crc kubenswrapper[4795]: E0310 15:27:35.154465 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="ceilometer-notification-agent" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.154471 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="ceilometer-notification-agent" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.154627 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="ceilometer-central-agent" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.154642 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="sg-core" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.154655 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="proxy-httpd" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.154674 4795 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" containerName="ceilometer-notification-agent" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.155258 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ede0-account-create-update-s67gw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.157981 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.164889 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ede0-account-create-update-s67gw"] Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.206280 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vb2q\" (UniqueName: \"kubernetes.io/projected/ada9ed25-edcf-4a12-89ed-5a195477f439-kube-api-access-6vb2q\") pod \"ada9ed25-edcf-4a12-89ed-5a195477f439\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.206483 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-run-httpd\") pod \"ada9ed25-edcf-4a12-89ed-5a195477f439\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.206561 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-scripts\") pod \"ada9ed25-edcf-4a12-89ed-5a195477f439\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.206700 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-log-httpd\") pod \"ada9ed25-edcf-4a12-89ed-5a195477f439\" (UID: 
\"ada9ed25-edcf-4a12-89ed-5a195477f439\") " Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.206717 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-config-data\") pod \"ada9ed25-edcf-4a12-89ed-5a195477f439\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.206778 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-combined-ca-bundle\") pod \"ada9ed25-edcf-4a12-89ed-5a195477f439\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.206818 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-sg-core-conf-yaml\") pod \"ada9ed25-edcf-4a12-89ed-5a195477f439\" (UID: \"ada9ed25-edcf-4a12-89ed-5a195477f439\") " Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.207465 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ada9ed25-edcf-4a12-89ed-5a195477f439" (UID: "ada9ed25-edcf-4a12-89ed-5a195477f439"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.208615 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8412ff64-8bdb-4af0-a844-997ad007635e-operator-scripts\") pod \"nova-cell1-db-create-9gzkw\" (UID: \"8412ff64-8bdb-4af0-a844-997ad007635e\") " pod="openstack/nova-cell1-db-create-9gzkw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.209095 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ada9ed25-edcf-4a12-89ed-5a195477f439" (UID: "ada9ed25-edcf-4a12-89ed-5a195477f439"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.209246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtt9b\" (UniqueName: \"kubernetes.io/projected/8412ff64-8bdb-4af0-a844-997ad007635e-kube-api-access-xtt9b\") pod \"nova-cell1-db-create-9gzkw\" (UID: \"8412ff64-8bdb-4af0-a844-997ad007635e\") " pod="openstack/nova-cell1-db-create-9gzkw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.210152 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.210181 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ada9ed25-edcf-4a12-89ed-5a195477f439-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.229118 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-scripts" (OuterVolumeSpecName: "scripts") pod "ada9ed25-edcf-4a12-89ed-5a195477f439" (UID: "ada9ed25-edcf-4a12-89ed-5a195477f439"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.232631 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ada9ed25-edcf-4a12-89ed-5a195477f439-kube-api-access-6vb2q" (OuterVolumeSpecName: "kube-api-access-6vb2q") pod "ada9ed25-edcf-4a12-89ed-5a195477f439" (UID: "ada9ed25-edcf-4a12-89ed-5a195477f439"). InnerVolumeSpecName "kube-api-access-6vb2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.256825 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ada9ed25-edcf-4a12-89ed-5a195477f439" (UID: "ada9ed25-edcf-4a12-89ed-5a195477f439"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.316522 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtt9b\" (UniqueName: \"kubernetes.io/projected/8412ff64-8bdb-4af0-a844-997ad007635e-kube-api-access-xtt9b\") pod \"nova-cell1-db-create-9gzkw\" (UID: \"8412ff64-8bdb-4af0-a844-997ad007635e\") " pod="openstack/nova-cell1-db-create-9gzkw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.316582 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82d29f02-6127-4a6a-8df2-b541ce3ee733-operator-scripts\") pod \"nova-cell0-ede0-account-create-update-s67gw\" (UID: \"82d29f02-6127-4a6a-8df2-b541ce3ee733\") " pod="openstack/nova-cell0-ede0-account-create-update-s67gw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.316624 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8412ff64-8bdb-4af0-a844-997ad007635e-operator-scripts\") pod \"nova-cell1-db-create-9gzkw\" (UID: \"8412ff64-8bdb-4af0-a844-997ad007635e\") " pod="openstack/nova-cell1-db-create-9gzkw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.316663 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctxmj\" (UniqueName: \"kubernetes.io/projected/82d29f02-6127-4a6a-8df2-b541ce3ee733-kube-api-access-ctxmj\") pod \"nova-cell0-ede0-account-create-update-s67gw\" (UID: \"82d29f02-6127-4a6a-8df2-b541ce3ee733\") " pod="openstack/nova-cell0-ede0-account-create-update-s67gw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.316738 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-sg-core-conf-yaml\") on node \"crc\" DevicePath 
\"\"" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.316750 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vb2q\" (UniqueName: \"kubernetes.io/projected/ada9ed25-edcf-4a12-89ed-5a195477f439-kube-api-access-6vb2q\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.316760 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.317398 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8412ff64-8bdb-4af0-a844-997ad007635e-operator-scripts\") pod \"nova-cell1-db-create-9gzkw\" (UID: \"8412ff64-8bdb-4af0-a844-997ad007635e\") " pod="openstack/nova-cell1-db-create-9gzkw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.322989 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ada9ed25-edcf-4a12-89ed-5a195477f439" (UID: "ada9ed25-edcf-4a12-89ed-5a195477f439"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.333224 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-4eb7-account-create-update-77mdf"] Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.334297 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.336057 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.337742 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtt9b\" (UniqueName: \"kubernetes.io/projected/8412ff64-8bdb-4af0-a844-997ad007635e-kube-api-access-xtt9b\") pod \"nova-cell1-db-create-9gzkw\" (UID: \"8412ff64-8bdb-4af0-a844-997ad007635e\") " pod="openstack/nova-cell1-db-create-9gzkw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.350354 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4eb7-account-create-update-77mdf"] Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.350487 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nmqx6" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.372660 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-config-data" (OuterVolumeSpecName: "config-data") pod "ada9ed25-edcf-4a12-89ed-5a195477f439" (UID: "ada9ed25-edcf-4a12-89ed-5a195477f439"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.375459 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e8a3-account-create-update-lzpjf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.401278 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9gzkw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.420894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctxmj\" (UniqueName: \"kubernetes.io/projected/82d29f02-6127-4a6a-8df2-b541ce3ee733-kube-api-access-ctxmj\") pod \"nova-cell0-ede0-account-create-update-s67gw\" (UID: \"82d29f02-6127-4a6a-8df2-b541ce3ee733\") " pod="openstack/nova-cell0-ede0-account-create-update-s67gw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.421150 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82d29f02-6127-4a6a-8df2-b541ce3ee733-operator-scripts\") pod \"nova-cell0-ede0-account-create-update-s67gw\" (UID: \"82d29f02-6127-4a6a-8df2-b541ce3ee733\") " pod="openstack/nova-cell0-ede0-account-create-update-s67gw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.431839 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.433379 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ada9ed25-edcf-4a12-89ed-5a195477f439-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.440370 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82d29f02-6127-4a6a-8df2-b541ce3ee733-operator-scripts\") pod \"nova-cell0-ede0-account-create-update-s67gw\" (UID: \"82d29f02-6127-4a6a-8df2-b541ce3ee733\") " pod="openstack/nova-cell0-ede0-account-create-update-s67gw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.451493 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-ctxmj\" (UniqueName: \"kubernetes.io/projected/82d29f02-6127-4a6a-8df2-b541ce3ee733-kube-api-access-ctxmj\") pod \"nova-cell0-ede0-account-create-update-s67gw\" (UID: \"82d29f02-6127-4a6a-8df2-b541ce3ee733\") " pod="openstack/nova-cell0-ede0-account-create-update-s67gw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.495481 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ede0-account-create-update-s67gw" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.535512 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwg2j\" (UniqueName: \"kubernetes.io/projected/e3002df7-165d-4f82-9e02-4d48bf960c87-kube-api-access-xwg2j\") pod \"nova-cell1-4eb7-account-create-update-77mdf\" (UID: \"e3002df7-165d-4f82-9e02-4d48bf960c87\") " pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.535871 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3002df7-165d-4f82-9e02-4d48bf960c87-operator-scripts\") pod \"nova-cell1-4eb7-account-create-update-77mdf\" (UID: \"e3002df7-165d-4f82-9e02-4d48bf960c87\") " pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.602382 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6b868dfb95-x8b6r"] Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.637352 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-7t4xc"] Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.650641 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3002df7-165d-4f82-9e02-4d48bf960c87-operator-scripts\") pod 
\"nova-cell1-4eb7-account-create-update-77mdf\" (UID: \"e3002df7-165d-4f82-9e02-4d48bf960c87\") " pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.651948 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwg2j\" (UniqueName: \"kubernetes.io/projected/e3002df7-165d-4f82-9e02-4d48bf960c87-kube-api-access-xwg2j\") pod \"nova-cell1-4eb7-account-create-update-77mdf\" (UID: \"e3002df7-165d-4f82-9e02-4d48bf960c87\") " pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.653557 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3002df7-165d-4f82-9e02-4d48bf960c87-operator-scripts\") pod \"nova-cell1-4eb7-account-create-update-77mdf\" (UID: \"e3002df7-165d-4f82-9e02-4d48bf960c87\") " pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.669528 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwg2j\" (UniqueName: \"kubernetes.io/projected/e3002df7-165d-4f82-9e02-4d48bf960c87-kube-api-access-xwg2j\") pod \"nova-cell1-4eb7-account-create-update-77mdf\" (UID: \"e3002df7-165d-4f82-9e02-4d48bf960c87\") " pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.952059 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-nmqx6"] Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.962969 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.973430 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6b868dfb95-x8b6r" event={"ID":"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1","Type":"ContainerStarted","Data":"47b2d268c6a3782aba5f991881edcaf20eb15142691d2a4df853d0b54e85507e"} Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.973486 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6b868dfb95-x8b6r" event={"ID":"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1","Type":"ContainerStarted","Data":"ff7d8ab9d6e0b63597275d93cea416b98df6f45b09fb2dd732a9df96cdb8018b"} Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.979165 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nmqx6" event={"ID":"1de929ea-a6bc-48b9-9254-f0eaa6a73f36","Type":"ContainerStarted","Data":"1ea0c78df2f8c19ed1d735f99a6fc34d3979fb28fc4357e6f4c63c83c256b504"} Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.983743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"64902640-6d88-46eb-98f0-475f8f976aaa","Type":"ContainerStarted","Data":"7c244634af5a61f2478d5af14c352d91c61d81307636a5ad3079ba94b9558797"} Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.991198 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.992382 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7t4xc" event={"ID":"2a4dce0f-e1c8-434c-b513-0f6f86d89099","Type":"ContainerStarted","Data":"c466bef69580a6a3e3583f4e350912746cd8d3dccfadd079be646071b86b4f24"} Mar 10 15:27:35 crc kubenswrapper[4795]: I0310 15:27:35.992410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7t4xc" event={"ID":"2a4dce0f-e1c8-434c-b513-0f6f86d89099","Type":"ContainerStarted","Data":"438e93d479d38c5c550fe5f387d9701e60dd91f7ec817bbc96d02f246c4cac81"} Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.030188 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.514619636 podStartE2EDuration="11.030170799s" podCreationTimestamp="2026-03-10 15:27:25 +0000 UTC" firstStartedPulling="2026-03-10 15:27:26.425415062 +0000 UTC m=+1279.591155960" lastFinishedPulling="2026-03-10 15:27:34.940966225 +0000 UTC m=+1288.106707123" observedRunningTime="2026-03-10 15:27:36.004710221 +0000 UTC m=+1289.170451119" watchObservedRunningTime="2026-03-10 15:27:36.030170799 +0000 UTC m=+1289.195911697" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.037817 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-e8a3-account-create-update-lzpjf"] Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.062145 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-7t4xc" podStartSLOduration=2.062125562 podStartE2EDuration="2.062125562s" podCreationTimestamp="2026-03-10 15:27:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:36.02323188 +0000 UTC m=+1289.188972778" watchObservedRunningTime="2026-03-10 
15:27:36.062125562 +0000 UTC m=+1289.227866460" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.158304 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9gzkw"] Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.166376 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ede0-account-create-update-s67gw"] Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.192217 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.229456 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.255782 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.258057 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.260539 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.260805 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.267551 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.268586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.268624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-log-httpd\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.268730 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-scripts\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.269397 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.269447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-config-data\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.269496 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-run-httpd\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.269532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxhcp\" (UniqueName: \"kubernetes.io/projected/548bb169-84f7-432a-ba78-c5da18a0966d-kube-api-access-xxhcp\") pod 
\"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.372769 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.372834 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-config-data\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.372886 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-run-httpd\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.372912 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxhcp\" (UniqueName: \"kubernetes.io/projected/548bb169-84f7-432a-ba78-c5da18a0966d-kube-api-access-xxhcp\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.372977 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.373011 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-log-httpd\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.373162 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-scripts\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.374329 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-run-httpd\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.376737 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-log-httpd\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.379885 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.379903 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc 
kubenswrapper[4795]: I0310 15:27:36.382378 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-config-data\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.384657 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-scripts\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.396232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxhcp\" (UniqueName: \"kubernetes.io/projected/548bb169-84f7-432a-ba78-c5da18a0966d-kube-api-access-xxhcp\") pod \"ceilometer-0\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " pod="openstack/ceilometer-0" Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.498983 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-4eb7-account-create-update-77mdf"] Mar 10 15:27:36 crc kubenswrapper[4795]: I0310 15:27:36.577382 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.008344 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.019015 4795 generic.go:334] "Generic (PLEG): container finished" podID="e3002df7-165d-4f82-9e02-4d48bf960c87" containerID="3a5b205df7ef38a505f95a2d8756c9605b99541bcd1f12386901c56b5ff70611" exitCode=0 Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.019102 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" event={"ID":"e3002df7-165d-4f82-9e02-4d48bf960c87","Type":"ContainerDied","Data":"3a5b205df7ef38a505f95a2d8756c9605b99541bcd1f12386901c56b5ff70611"} Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.019131 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" event={"ID":"e3002df7-165d-4f82-9e02-4d48bf960c87","Type":"ContainerStarted","Data":"fb7b7751a1cf340d6406ca22f49d6df8498cce3ebbc7ada8c8d181aa2cb36d0a"} Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.022235 4795 generic.go:334] "Generic (PLEG): container finished" podID="5678c7de-b0fe-4d07-a752-1cd7eea46db6" containerID="442bea24fb4a81cd3787c64bf697eadca18160e3b23da0874fdb9689e66848b2" exitCode=0 Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.022302 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e8a3-account-create-update-lzpjf" event={"ID":"5678c7de-b0fe-4d07-a752-1cd7eea46db6","Type":"ContainerDied","Data":"442bea24fb4a81cd3787c64bf697eadca18160e3b23da0874fdb9689e66848b2"} Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.022326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e8a3-account-create-update-lzpjf" 
event={"ID":"5678c7de-b0fe-4d07-a752-1cd7eea46db6","Type":"ContainerStarted","Data":"7fe7f888591c1d3aff0957610cd3d68910459f5c1ce6c466de9a06784cd04141"} Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.032634 4795 generic.go:334] "Generic (PLEG): container finished" podID="2a4dce0f-e1c8-434c-b513-0f6f86d89099" containerID="c466bef69580a6a3e3583f4e350912746cd8d3dccfadd079be646071b86b4f24" exitCode=0 Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.032730 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7t4xc" event={"ID":"2a4dce0f-e1c8-434c-b513-0f6f86d89099","Type":"ContainerDied","Data":"c466bef69580a6a3e3583f4e350912746cd8d3dccfadd079be646071b86b4f24"} Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.042495 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6b868dfb95-x8b6r" event={"ID":"62ae38d2-b7a6-4c50-8506-dc3c18a89fd1","Type":"ContainerStarted","Data":"2158dc7ea3cddd5be90c85ddf271efb59d9dab1ea05fbbda200148a89bab1682"} Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.043220 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.043334 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.053044 4795 generic.go:334] "Generic (PLEG): container finished" podID="1de929ea-a6bc-48b9-9254-f0eaa6a73f36" containerID="c38a22ec55758969bb09dc77f11e0196b3242889a6387f045d0477d331c266d0" exitCode=0 Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.053186 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nmqx6" event={"ID":"1de929ea-a6bc-48b9-9254-f0eaa6a73f36","Type":"ContainerDied","Data":"c38a22ec55758969bb09dc77f11e0196b3242889a6387f045d0477d331c266d0"} Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 
15:27:37.071428 4795 generic.go:334] "Generic (PLEG): container finished" podID="82d29f02-6127-4a6a-8df2-b541ce3ee733" containerID="9ba9cb0a8abffd4cae87b0d21779ed81e17831e82002891cd86da8d3d80b74cb" exitCode=0 Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.071562 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ede0-account-create-update-s67gw" event={"ID":"82d29f02-6127-4a6a-8df2-b541ce3ee733","Type":"ContainerDied","Data":"9ba9cb0a8abffd4cae87b0d21779ed81e17831e82002891cd86da8d3d80b74cb"} Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.071589 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ede0-account-create-update-s67gw" event={"ID":"82d29f02-6127-4a6a-8df2-b541ce3ee733","Type":"ContainerStarted","Data":"d6b0c32f76b5ec0d0fe6dc7bf9565236d024a1ae87cbda5c522b4dacbcee50cd"} Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.074877 4795 generic.go:334] "Generic (PLEG): container finished" podID="8412ff64-8bdb-4af0-a844-997ad007635e" containerID="aa96d020ad3adf8993443396071d816ecf85b246fd51593f4c37ce1959b38d13" exitCode=0 Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.074910 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9gzkw" event={"ID":"8412ff64-8bdb-4af0-a844-997ad007635e","Type":"ContainerDied","Data":"aa96d020ad3adf8993443396071d816ecf85b246fd51593f4c37ce1959b38d13"} Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.074938 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9gzkw" event={"ID":"8412ff64-8bdb-4af0-a844-997ad007635e","Type":"ContainerStarted","Data":"16aa7715b0411832b52e06f658b69b9adf68536c65fd847096af58c79d160a70"} Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.153125 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6b868dfb95-x8b6r" podStartSLOduration=8.153106866 podStartE2EDuration="8.153106866s" 
podCreationTimestamp="2026-03-10 15:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:37.097038133 +0000 UTC m=+1290.262779031" watchObservedRunningTime="2026-03-10 15:27:37.153106866 +0000 UTC m=+1290.318847764" Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.163281 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:37 crc kubenswrapper[4795]: I0310 15:27:37.489343 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ada9ed25-edcf-4a12-89ed-5a195477f439" path="/var/lib/kubelet/pods/ada9ed25-edcf-4a12-89ed-5a195477f439/volumes" Mar 10 15:27:38 crc kubenswrapper[4795]: I0310 15:27:38.084140 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"548bb169-84f7-432a-ba78-c5da18a0966d","Type":"ContainerStarted","Data":"4c876561ce0d295cfdb5f47dacfaa26915f93a73c5f0d0f770888a2c83e5048e"} Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.051617 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.127777 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"548bb169-84f7-432a-ba78-c5da18a0966d","Type":"ContainerStarted","Data":"e564d3c4690955d1224e2817c97e7ed9adc93e938bb0a4973b1823bbfc60e183"} Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.134129 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" event={"ID":"e3002df7-165d-4f82-9e02-4d48bf960c87","Type":"ContainerDied","Data":"fb7b7751a1cf340d6406ca22f49d6df8498cce3ebbc7ada8c8d181aa2cb36d0a"} Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.134169 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb7b7751a1cf340d6406ca22f49d6df8498cce3ebbc7ada8c8d181aa2cb36d0a" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.134230 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-4eb7-account-create-update-77mdf" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.196475 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwg2j\" (UniqueName: \"kubernetes.io/projected/e3002df7-165d-4f82-9e02-4d48bf960c87-kube-api-access-xwg2j\") pod \"e3002df7-165d-4f82-9e02-4d48bf960c87\" (UID: \"e3002df7-165d-4f82-9e02-4d48bf960c87\") " Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.196646 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3002df7-165d-4f82-9e02-4d48bf960c87-operator-scripts\") pod \"e3002df7-165d-4f82-9e02-4d48bf960c87\" (UID: \"e3002df7-165d-4f82-9e02-4d48bf960c87\") " Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.197398 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3002df7-165d-4f82-9e02-4d48bf960c87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3002df7-165d-4f82-9e02-4d48bf960c87" (UID: "e3002df7-165d-4f82-9e02-4d48bf960c87"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.200415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3002df7-165d-4f82-9e02-4d48bf960c87-kube-api-access-xwg2j" (OuterVolumeSpecName: "kube-api-access-xwg2j") pod "e3002df7-165d-4f82-9e02-4d48bf960c87" (UID: "e3002df7-165d-4f82-9e02-4d48bf960c87"). InnerVolumeSpecName "kube-api-access-xwg2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.272039 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9gzkw" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.276542 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nmqx6" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.282433 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7t4xc" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.292563 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-e8a3-account-create-update-lzpjf" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.297075 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ede0-account-create-update-s67gw" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.298428 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwg2j\" (UniqueName: \"kubernetes.io/projected/e3002df7-165d-4f82-9e02-4d48bf960c87-kube-api-access-xwg2j\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.298505 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3002df7-165d-4f82-9e02-4d48bf960c87-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.399859 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-operator-scripts\") pod \"1de929ea-a6bc-48b9-9254-f0eaa6a73f36\" (UID: \"1de929ea-a6bc-48b9-9254-f0eaa6a73f36\") " Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.399918 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtjk8\" (UniqueName: 
\"kubernetes.io/projected/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-kube-api-access-xtjk8\") pod \"1de929ea-a6bc-48b9-9254-f0eaa6a73f36\" (UID: \"1de929ea-a6bc-48b9-9254-f0eaa6a73f36\") " Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.399968 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4dce0f-e1c8-434c-b513-0f6f86d89099-operator-scripts\") pod \"2a4dce0f-e1c8-434c-b513-0f6f86d89099\" (UID: \"2a4dce0f-e1c8-434c-b513-0f6f86d89099\") " Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.399996 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8412ff64-8bdb-4af0-a844-997ad007635e-operator-scripts\") pod \"8412ff64-8bdb-4af0-a844-997ad007635e\" (UID: \"8412ff64-8bdb-4af0-a844-997ad007635e\") " Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400034 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5678c7de-b0fe-4d07-a752-1cd7eea46db6-operator-scripts\") pod \"5678c7de-b0fe-4d07-a752-1cd7eea46db6\" (UID: \"5678c7de-b0fe-4d07-a752-1cd7eea46db6\") " Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400134 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82d29f02-6127-4a6a-8df2-b541ce3ee733-operator-scripts\") pod \"82d29f02-6127-4a6a-8df2-b541ce3ee733\" (UID: \"82d29f02-6127-4a6a-8df2-b541ce3ee733\") " Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400182 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lld92\" (UniqueName: \"kubernetes.io/projected/2a4dce0f-e1c8-434c-b513-0f6f86d89099-kube-api-access-lld92\") pod \"2a4dce0f-e1c8-434c-b513-0f6f86d89099\" (UID: \"2a4dce0f-e1c8-434c-b513-0f6f86d89099\") " Mar 10 15:27:39 
crc kubenswrapper[4795]: I0310 15:27:39.400232 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtt9b\" (UniqueName: \"kubernetes.io/projected/8412ff64-8bdb-4af0-a844-997ad007635e-kube-api-access-xtt9b\") pod \"8412ff64-8bdb-4af0-a844-997ad007635e\" (UID: \"8412ff64-8bdb-4af0-a844-997ad007635e\") " Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400307 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqjls\" (UniqueName: \"kubernetes.io/projected/5678c7de-b0fe-4d07-a752-1cd7eea46db6-kube-api-access-cqjls\") pod \"5678c7de-b0fe-4d07-a752-1cd7eea46db6\" (UID: \"5678c7de-b0fe-4d07-a752-1cd7eea46db6\") " Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400333 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctxmj\" (UniqueName: \"kubernetes.io/projected/82d29f02-6127-4a6a-8df2-b541ce3ee733-kube-api-access-ctxmj\") pod \"82d29f02-6127-4a6a-8df2-b541ce3ee733\" (UID: \"82d29f02-6127-4a6a-8df2-b541ce3ee733\") " Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400435 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8412ff64-8bdb-4af0-a844-997ad007635e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8412ff64-8bdb-4af0-a844-997ad007635e" (UID: "8412ff64-8bdb-4af0-a844-997ad007635e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400473 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a4dce0f-e1c8-434c-b513-0f6f86d89099-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a4dce0f-e1c8-434c-b513-0f6f86d89099" (UID: "2a4dce0f-e1c8-434c-b513-0f6f86d89099"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400542 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5678c7de-b0fe-4d07-a752-1cd7eea46db6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5678c7de-b0fe-4d07-a752-1cd7eea46db6" (UID: "5678c7de-b0fe-4d07-a752-1cd7eea46db6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400863 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a4dce0f-e1c8-434c-b513-0f6f86d89099-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400891 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8412ff64-8bdb-4af0-a844-997ad007635e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400893 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82d29f02-6127-4a6a-8df2-b541ce3ee733-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82d29f02-6127-4a6a-8df2-b541ce3ee733" (UID: "82d29f02-6127-4a6a-8df2-b541ce3ee733"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400904 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5678c7de-b0fe-4d07-a752-1cd7eea46db6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.400947 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1de929ea-a6bc-48b9-9254-f0eaa6a73f36" (UID: "1de929ea-a6bc-48b9-9254-f0eaa6a73f36"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.405410 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82d29f02-6127-4a6a-8df2-b541ce3ee733-kube-api-access-ctxmj" (OuterVolumeSpecName: "kube-api-access-ctxmj") pod "82d29f02-6127-4a6a-8df2-b541ce3ee733" (UID: "82d29f02-6127-4a6a-8df2-b541ce3ee733"). InnerVolumeSpecName "kube-api-access-ctxmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.405434 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-kube-api-access-xtjk8" (OuterVolumeSpecName: "kube-api-access-xtjk8") pod "1de929ea-a6bc-48b9-9254-f0eaa6a73f36" (UID: "1de929ea-a6bc-48b9-9254-f0eaa6a73f36"). InnerVolumeSpecName "kube-api-access-xtjk8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.405420 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a4dce0f-e1c8-434c-b513-0f6f86d89099-kube-api-access-lld92" (OuterVolumeSpecName: "kube-api-access-lld92") pod "2a4dce0f-e1c8-434c-b513-0f6f86d89099" (UID: "2a4dce0f-e1c8-434c-b513-0f6f86d89099"). InnerVolumeSpecName "kube-api-access-lld92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.405577 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8412ff64-8bdb-4af0-a844-997ad007635e-kube-api-access-xtt9b" (OuterVolumeSpecName: "kube-api-access-xtt9b") pod "8412ff64-8bdb-4af0-a844-997ad007635e" (UID: "8412ff64-8bdb-4af0-a844-997ad007635e"). InnerVolumeSpecName "kube-api-access-xtt9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.405854 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5678c7de-b0fe-4d07-a752-1cd7eea46db6-kube-api-access-cqjls" (OuterVolumeSpecName: "kube-api-access-cqjls") pod "5678c7de-b0fe-4d07-a752-1cd7eea46db6" (UID: "5678c7de-b0fe-4d07-a752-1cd7eea46db6"). InnerVolumeSpecName "kube-api-access-cqjls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.502352 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82d29f02-6127-4a6a-8df2-b541ce3ee733-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.502386 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lld92\" (UniqueName: \"kubernetes.io/projected/2a4dce0f-e1c8-434c-b513-0f6f86d89099-kube-api-access-lld92\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.502401 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtt9b\" (UniqueName: \"kubernetes.io/projected/8412ff64-8bdb-4af0-a844-997ad007635e-kube-api-access-xtt9b\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.502412 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqjls\" (UniqueName: \"kubernetes.io/projected/5678c7de-b0fe-4d07-a752-1cd7eea46db6-kube-api-access-cqjls\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.502422 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctxmj\" (UniqueName: \"kubernetes.io/projected/82d29f02-6127-4a6a-8df2-b541ce3ee733-kube-api-access-ctxmj\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.502433 4795 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:39 crc kubenswrapper[4795]: I0310 15:27:39.502443 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtjk8\" (UniqueName: \"kubernetes.io/projected/1de929ea-a6bc-48b9-9254-f0eaa6a73f36-kube-api-access-xtjk8\") on node \"crc\" 
DevicePath \"\"" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.060836 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.147245 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-nmqx6" event={"ID":"1de929ea-a6bc-48b9-9254-f0eaa6a73f36","Type":"ContainerDied","Data":"1ea0c78df2f8c19ed1d735f99a6fc34d3979fb28fc4357e6f4c63c83c256b504"} Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.147282 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ea0c78df2f8c19ed1d735f99a6fc34d3979fb28fc4357e6f4c63c83c256b504" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.147312 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-nmqx6" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.149152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ede0-account-create-update-s67gw" event={"ID":"82d29f02-6127-4a6a-8df2-b541ce3ee733","Type":"ContainerDied","Data":"d6b0c32f76b5ec0d0fe6dc7bf9565236d024a1ae87cbda5c522b4dacbcee50cd"} Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.149172 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6b0c32f76b5ec0d0fe6dc7bf9565236d024a1ae87cbda5c522b4dacbcee50cd" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.149233 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ede0-account-create-update-s67gw" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.151668 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-9gzkw" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.151687 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9gzkw" event={"ID":"8412ff64-8bdb-4af0-a844-997ad007635e","Type":"ContainerDied","Data":"16aa7715b0411832b52e06f658b69b9adf68536c65fd847096af58c79d160a70"} Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.151741 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16aa7715b0411832b52e06f658b69b9adf68536c65fd847096af58c79d160a70" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.154959 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"548bb169-84f7-432a-ba78-c5da18a0966d","Type":"ContainerStarted","Data":"8dfaf95a6d854141cbcf95ff530d4ee917d61f933da74a4a5e1c63984fd2ca8e"} Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.154990 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"548bb169-84f7-432a-ba78-c5da18a0966d","Type":"ContainerStarted","Data":"2e9ca0c98eb5dfd3abd40faeedff900feceeba0be6380c6b7e2fa6464ee59359"} Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.156704 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-e8a3-account-create-update-lzpjf" event={"ID":"5678c7de-b0fe-4d07-a752-1cd7eea46db6","Type":"ContainerDied","Data":"7fe7f888591c1d3aff0957610cd3d68910459f5c1ce6c466de9a06784cd04141"} Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.156747 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fe7f888591c1d3aff0957610cd3d68910459f5c1ce6c466de9a06784cd04141" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.157189 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-e8a3-account-create-update-lzpjf" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.158257 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-7t4xc" event={"ID":"2a4dce0f-e1c8-434c-b513-0f6f86d89099","Type":"ContainerDied","Data":"438e93d479d38c5c550fe5f387d9701e60dd91f7ec817bbc96d02f246c4cac81"} Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.158297 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="438e93d479d38c5c550fe5f387d9701e60dd91f7ec817bbc96d02f246c4cac81" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.158325 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-7t4xc" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.441628 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-9f8b48dd7-fxv5t" Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.523408 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7968b684f6-5dwff"] Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.523620 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7968b684f6-5dwff" podUID="8ebc3860-b7d7-4d35-b77a-3413647b0be4" containerName="neutron-api" containerID="cri-o://31f007b4e6d09cb48e4de2181e5984486844b6e2026abbca00e17b4e06a47d6b" gracePeriod=30 Mar 10 15:27:40 crc kubenswrapper[4795]: I0310 15:27:40.524224 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7968b684f6-5dwff" podUID="8ebc3860-b7d7-4d35-b77a-3413647b0be4" containerName="neutron-httpd" containerID="cri-o://546cd9fc01e7eb5fb85f6cec9477bb99c629b460a781f59f1023b01c37c4fc90" gracePeriod=30 Mar 10 15:27:41 crc kubenswrapper[4795]: I0310 15:27:41.168380 4795 generic.go:334] "Generic (PLEG): container finished" 
podID="8ebc3860-b7d7-4d35-b77a-3413647b0be4" containerID="546cd9fc01e7eb5fb85f6cec9477bb99c629b460a781f59f1023b01c37c4fc90" exitCode=0 Mar 10 15:27:41 crc kubenswrapper[4795]: I0310 15:27:41.168437 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7968b684f6-5dwff" event={"ID":"8ebc3860-b7d7-4d35-b77a-3413647b0be4","Type":"ContainerDied","Data":"546cd9fc01e7eb5fb85f6cec9477bb99c629b460a781f59f1023b01c37c4fc90"} Mar 10 15:27:41 crc kubenswrapper[4795]: I0310 15:27:41.471313 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67547556b6-45876" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Mar 10 15:27:41 crc kubenswrapper[4795]: I0310 15:27:41.471451 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67547556b6-45876" Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.055814 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.056352 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" containerName="glance-log" containerID="cri-o://f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058" gracePeriod=30 Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.056448 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" containerName="glance-httpd" containerID="cri-o://ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2" gracePeriod=30 Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.181342 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"548bb169-84f7-432a-ba78-c5da18a0966d","Type":"ContainerStarted","Data":"5ca3b7c82ea996343a01c4d58acb11af8be186c5c35b7cfab528d88eb9254317"} Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.181472 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="ceilometer-central-agent" containerID="cri-o://e564d3c4690955d1224e2817c97e7ed9adc93e938bb0a4973b1823bbfc60e183" gracePeriod=30 Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.181531 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="proxy-httpd" containerID="cri-o://5ca3b7c82ea996343a01c4d58acb11af8be186c5c35b7cfab528d88eb9254317" gracePeriod=30 Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.181546 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="sg-core" containerID="cri-o://8dfaf95a6d854141cbcf95ff530d4ee917d61f933da74a4a5e1c63984fd2ca8e" gracePeriod=30 Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.181582 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="ceilometer-notification-agent" containerID="cri-o://2e9ca0c98eb5dfd3abd40faeedff900feceeba0be6380c6b7e2fa6464ee59359" gracePeriod=30 Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.181664 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.186140 4795 generic.go:334] "Generic (PLEG): container finished" podID="8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" 
containerID="f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058" exitCode=143 Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.186192 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8","Type":"ContainerDied","Data":"f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058"} Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.222221 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.887609557 podStartE2EDuration="6.222203485s" podCreationTimestamp="2026-03-10 15:27:36 +0000 UTC" firstStartedPulling="2026-03-10 15:27:37.139323902 +0000 UTC m=+1290.305064800" lastFinishedPulling="2026-03-10 15:27:41.47391784 +0000 UTC m=+1294.639658728" observedRunningTime="2026-03-10 15:27:42.210229543 +0000 UTC m=+1295.375970451" watchObservedRunningTime="2026-03-10 15:27:42.222203485 +0000 UTC m=+1295.387944383" Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.990836 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.991985 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3e7caf4d-b232-4ef1-bcbb-9e11adf55727" containerName="glance-log" containerID="cri-o://b73362a15e1523da055b029775569493559757e75675e22b21bdf30ecc6a3fbe" gracePeriod=30 Mar 10 15:27:42 crc kubenswrapper[4795]: I0310 15:27:42.992166 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="3e7caf4d-b232-4ef1-bcbb-9e11adf55727" containerName="glance-httpd" containerID="cri-o://4cc6b3909b7fae8597719b3798d9b553711ed002f88047133e6c9d36dd5de069" gracePeriod=30 Mar 10 15:27:43 crc kubenswrapper[4795]: I0310 15:27:43.205687 4795 generic.go:334] "Generic (PLEG): 
container finished" podID="3e7caf4d-b232-4ef1-bcbb-9e11adf55727" containerID="b73362a15e1523da055b029775569493559757e75675e22b21bdf30ecc6a3fbe" exitCode=143 Mar 10 15:27:43 crc kubenswrapper[4795]: I0310 15:27:43.205758 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e7caf4d-b232-4ef1-bcbb-9e11adf55727","Type":"ContainerDied","Data":"b73362a15e1523da055b029775569493559757e75675e22b21bdf30ecc6a3fbe"} Mar 10 15:27:43 crc kubenswrapper[4795]: I0310 15:27:43.208749 4795 generic.go:334] "Generic (PLEG): container finished" podID="548bb169-84f7-432a-ba78-c5da18a0966d" containerID="5ca3b7c82ea996343a01c4d58acb11af8be186c5c35b7cfab528d88eb9254317" exitCode=0 Mar 10 15:27:43 crc kubenswrapper[4795]: I0310 15:27:43.208779 4795 generic.go:334] "Generic (PLEG): container finished" podID="548bb169-84f7-432a-ba78-c5da18a0966d" containerID="8dfaf95a6d854141cbcf95ff530d4ee917d61f933da74a4a5e1c63984fd2ca8e" exitCode=2 Mar 10 15:27:43 crc kubenswrapper[4795]: I0310 15:27:43.208787 4795 generic.go:334] "Generic (PLEG): container finished" podID="548bb169-84f7-432a-ba78-c5da18a0966d" containerID="2e9ca0c98eb5dfd3abd40faeedff900feceeba0be6380c6b7e2fa6464ee59359" exitCode=0 Mar 10 15:27:43 crc kubenswrapper[4795]: I0310 15:27:43.208804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"548bb169-84f7-432a-ba78-c5da18a0966d","Type":"ContainerDied","Data":"5ca3b7c82ea996343a01c4d58acb11af8be186c5c35b7cfab528d88eb9254317"} Mar 10 15:27:43 crc kubenswrapper[4795]: I0310 15:27:43.208829 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"548bb169-84f7-432a-ba78-c5da18a0966d","Type":"ContainerDied","Data":"8dfaf95a6d854141cbcf95ff530d4ee917d61f933da74a4a5e1c63984fd2ca8e"} Mar 10 15:27:43 crc kubenswrapper[4795]: I0310 15:27:43.208838 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"548bb169-84f7-432a-ba78-c5da18a0966d","Type":"ContainerDied","Data":"2e9ca0c98eb5dfd3abd40faeedff900feceeba0be6380c6b7e2fa6464ee59359"} Mar 10 15:27:44 crc kubenswrapper[4795]: I0310 15:27:44.217014 4795 generic.go:334] "Generic (PLEG): container finished" podID="8ebc3860-b7d7-4d35-b77a-3413647b0be4" containerID="31f007b4e6d09cb48e4de2181e5984486844b6e2026abbca00e17b4e06a47d6b" exitCode=0 Mar 10 15:27:44 crc kubenswrapper[4795]: I0310 15:27:44.217123 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7968b684f6-5dwff" event={"ID":"8ebc3860-b7d7-4d35-b77a-3413647b0be4","Type":"ContainerDied","Data":"31f007b4e6d09cb48e4de2181e5984486844b6e2026abbca00e17b4e06a47d6b"} Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.057866 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6b868dfb95-x8b6r" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.118449 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.228077 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7968b684f6-5dwff" event={"ID":"8ebc3860-b7d7-4d35-b77a-3413647b0be4","Type":"ContainerDied","Data":"8849b7a7d4087ecf37b2349bfce97f7b07f41c816828d16297074301e4d9d48c"} Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.228126 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7968b684f6-5dwff" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.228138 4795 scope.go:117] "RemoveContainer" containerID="546cd9fc01e7eb5fb85f6cec9477bb99c629b460a781f59f1023b01c37c4fc90" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.258460 4795 scope.go:117] "RemoveContainer" containerID="31f007b4e6d09cb48e4de2181e5984486844b6e2026abbca00e17b4e06a47d6b" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.312060 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-httpd-config\") pod \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.312444 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgfsl\" (UniqueName: \"kubernetes.io/projected/8ebc3860-b7d7-4d35-b77a-3413647b0be4-kube-api-access-fgfsl\") pod \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.312521 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-config\") pod \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.312589 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-combined-ca-bundle\") pod \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.312632 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-ovndb-tls-certs\") pod \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\" (UID: \"8ebc3860-b7d7-4d35-b77a-3413647b0be4\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.319208 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8ebc3860-b7d7-4d35-b77a-3413647b0be4" (UID: "8ebc3860-b7d7-4d35-b77a-3413647b0be4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.319263 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ebc3860-b7d7-4d35-b77a-3413647b0be4-kube-api-access-fgfsl" (OuterVolumeSpecName: "kube-api-access-fgfsl") pod "8ebc3860-b7d7-4d35-b77a-3413647b0be4" (UID: "8ebc3860-b7d7-4d35-b77a-3413647b0be4"). InnerVolumeSpecName "kube-api-access-fgfsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.407759 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-config" (OuterVolumeSpecName: "config") pod "8ebc3860-b7d7-4d35-b77a-3413647b0be4" (UID: "8ebc3860-b7d7-4d35-b77a-3413647b0be4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.414499 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.414523 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgfsl\" (UniqueName: \"kubernetes.io/projected/8ebc3860-b7d7-4d35-b77a-3413647b0be4-kube-api-access-fgfsl\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.414533 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.415555 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ebc3860-b7d7-4d35-b77a-3413647b0be4" (UID: "8ebc3860-b7d7-4d35-b77a-3413647b0be4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417185 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ldnh5"] Mar 10 15:27:45 crc kubenswrapper[4795]: E0310 15:27:45.417550 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1de929ea-a6bc-48b9-9254-f0eaa6a73f36" containerName="mariadb-database-create" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417562 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1de929ea-a6bc-48b9-9254-f0eaa6a73f36" containerName="mariadb-database-create" Mar 10 15:27:45 crc kubenswrapper[4795]: E0310 15:27:45.417575 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ebc3860-b7d7-4d35-b77a-3413647b0be4" containerName="neutron-api" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417582 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebc3860-b7d7-4d35-b77a-3413647b0be4" containerName="neutron-api" Mar 10 15:27:45 crc kubenswrapper[4795]: E0310 15:27:45.417593 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ebc3860-b7d7-4d35-b77a-3413647b0be4" containerName="neutron-httpd" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417600 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebc3860-b7d7-4d35-b77a-3413647b0be4" containerName="neutron-httpd" Mar 10 15:27:45 crc kubenswrapper[4795]: E0310 15:27:45.417615 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8412ff64-8bdb-4af0-a844-997ad007635e" containerName="mariadb-database-create" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417621 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8412ff64-8bdb-4af0-a844-997ad007635e" containerName="mariadb-database-create" Mar 10 15:27:45 crc kubenswrapper[4795]: E0310 15:27:45.417633 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2a4dce0f-e1c8-434c-b513-0f6f86d89099" containerName="mariadb-database-create" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417639 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a4dce0f-e1c8-434c-b513-0f6f86d89099" containerName="mariadb-database-create" Mar 10 15:27:45 crc kubenswrapper[4795]: E0310 15:27:45.417654 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3002df7-165d-4f82-9e02-4d48bf960c87" containerName="mariadb-account-create-update" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417660 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3002df7-165d-4f82-9e02-4d48bf960c87" containerName="mariadb-account-create-update" Mar 10 15:27:45 crc kubenswrapper[4795]: E0310 15:27:45.417676 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82d29f02-6127-4a6a-8df2-b541ce3ee733" containerName="mariadb-account-create-update" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417681 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="82d29f02-6127-4a6a-8df2-b541ce3ee733" containerName="mariadb-account-create-update" Mar 10 15:27:45 crc kubenswrapper[4795]: E0310 15:27:45.417693 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5678c7de-b0fe-4d07-a752-1cd7eea46db6" containerName="mariadb-account-create-update" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417699 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5678c7de-b0fe-4d07-a752-1cd7eea46db6" containerName="mariadb-account-create-update" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417850 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5678c7de-b0fe-4d07-a752-1cd7eea46db6" containerName="mariadb-account-create-update" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417865 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a4dce0f-e1c8-434c-b513-0f6f86d89099" containerName="mariadb-database-create" Mar 10 
15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417872 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ebc3860-b7d7-4d35-b77a-3413647b0be4" containerName="neutron-httpd" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417881 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ebc3860-b7d7-4d35-b77a-3413647b0be4" containerName="neutron-api" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417892 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8412ff64-8bdb-4af0-a844-997ad007635e" containerName="mariadb-database-create" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417901 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1de929ea-a6bc-48b9-9254-f0eaa6a73f36" containerName="mariadb-database-create" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417912 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="82d29f02-6127-4a6a-8df2-b541ce3ee733" containerName="mariadb-account-create-update" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.417921 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3002df7-165d-4f82-9e02-4d48bf960c87" containerName="mariadb-account-create-update" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.418526 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.420832 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.421163 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hcmrb" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.423594 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.455123 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ldnh5"] Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.480880 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8ebc3860-b7d7-4d35-b77a-3413647b0be4" (UID: "8ebc3860-b7d7-4d35-b77a-3413647b0be4"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.530980 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-config-data\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.531033 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.531098 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-scripts\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.531233 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksmwd\" (UniqueName: \"kubernetes.io/projected/7e6445e9-4e77-48dd-8550-e4068a6d9db2-kube-api-access-ksmwd\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.534255 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" 
Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.534281 4795 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8ebc3860-b7d7-4d35-b77a-3413647b0be4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.567811 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7968b684f6-5dwff"] Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.575846 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7968b684f6-5dwff"] Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.636330 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-config-data\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.637442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.637482 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-scripts\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.637603 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksmwd\" (UniqueName: 
\"kubernetes.io/projected/7e6445e9-4e77-48dd-8550-e4068a6d9db2-kube-api-access-ksmwd\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.641244 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.641942 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-scripts\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.642457 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-config-data\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.656009 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksmwd\" (UniqueName: \"kubernetes.io/projected/7e6445e9-4e77-48dd-8550-e4068a6d9db2-kube-api-access-ksmwd\") pod \"nova-cell0-conductor-db-sync-ldnh5\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.698784 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.739713 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-scripts\") pod \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.739782 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-httpd-run\") pod \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.739831 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.739862 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-combined-ca-bundle\") pod \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.739904 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-config-data\") pod \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.739981 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-public-tls-certs\") pod \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.740027 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-logs\") pod \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.740102 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4brgx\" (UniqueName: \"kubernetes.io/projected/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-kube-api-access-4brgx\") pod \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\" (UID: \"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.740999 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" (UID: "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.741383 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-logs" (OuterVolumeSpecName: "logs") pod "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" (UID: "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.745041 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" (UID: "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.746704 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-scripts" (OuterVolumeSpecName: "scripts") pod "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" (UID: "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.752204 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-kube-api-access-4brgx" (OuterVolumeSpecName: "kube-api-access-4brgx") pod "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" (UID: "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8"). InnerVolumeSpecName "kube-api-access-4brgx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.766396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.775046 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" (UID: "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.813360 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-config-data" (OuterVolumeSpecName: "config-data") pod "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" (UID: "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.814760 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" (UID: "8a9e70da-59e9-47fb-a8cd-89f3577ddcf8"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.842017 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.842050 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.842060 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.842085 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc 
kubenswrapper[4795]: I0310 15:27:45.842093 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.842102 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4brgx\" (UniqueName: \"kubernetes.io/projected/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-kube-api-access-4brgx\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.842109 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.842118 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.861023 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.887216 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67547556b6-45876" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.945102 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-tls-certs\") pod \"9c0c78d2-7838-4836-975b-87312ba1c49e\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.945141 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-combined-ca-bundle\") pod \"9c0c78d2-7838-4836-975b-87312ba1c49e\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.945167 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsf2c\" (UniqueName: \"kubernetes.io/projected/9c0c78d2-7838-4836-975b-87312ba1c49e-kube-api-access-rsf2c\") pod \"9c0c78d2-7838-4836-975b-87312ba1c49e\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.945268 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c0c78d2-7838-4836-975b-87312ba1c49e-logs\") pod \"9c0c78d2-7838-4836-975b-87312ba1c49e\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.945336 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-scripts\") pod \"9c0c78d2-7838-4836-975b-87312ba1c49e\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.945406 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-config-data\") pod \"9c0c78d2-7838-4836-975b-87312ba1c49e\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.945446 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-secret-key\") pod \"9c0c78d2-7838-4836-975b-87312ba1c49e\" (UID: \"9c0c78d2-7838-4836-975b-87312ba1c49e\") " Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.945803 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.951884 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c0c78d2-7838-4836-975b-87312ba1c49e-logs" (OuterVolumeSpecName: "logs") pod "9c0c78d2-7838-4836-975b-87312ba1c49e" (UID: "9c0c78d2-7838-4836-975b-87312ba1c49e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.955630 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c0c78d2-7838-4836-975b-87312ba1c49e-kube-api-access-rsf2c" (OuterVolumeSpecName: "kube-api-access-rsf2c") pod "9c0c78d2-7838-4836-975b-87312ba1c49e" (UID: "9c0c78d2-7838-4836-975b-87312ba1c49e"). InnerVolumeSpecName "kube-api-access-rsf2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.955967 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9c0c78d2-7838-4836-975b-87312ba1c49e" (UID: "9c0c78d2-7838-4836-975b-87312ba1c49e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.979469 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-scripts" (OuterVolumeSpecName: "scripts") pod "9c0c78d2-7838-4836-975b-87312ba1c49e" (UID: "9c0c78d2-7838-4836-975b-87312ba1c49e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.980349 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c0c78d2-7838-4836-975b-87312ba1c49e" (UID: "9c0c78d2-7838-4836-975b-87312ba1c49e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:45 crc kubenswrapper[4795]: I0310 15:27:45.994094 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-config-data" (OuterVolumeSpecName: "config-data") pod "9c0c78d2-7838-4836-975b-87312ba1c49e" (UID: "9c0c78d2-7838-4836-975b-87312ba1c49e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.021355 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "9c0c78d2-7838-4836-975b-87312ba1c49e" (UID: "9c0c78d2-7838-4836-975b-87312ba1c49e"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.048290 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c0c78d2-7838-4836-975b-87312ba1c49e-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.048326 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.048337 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9c0c78d2-7838-4836-975b-87312ba1c49e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.048348 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.048361 4795 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.048370 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9c0c78d2-7838-4836-975b-87312ba1c49e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.048381 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsf2c\" (UniqueName: \"kubernetes.io/projected/9c0c78d2-7838-4836-975b-87312ba1c49e-kube-api-access-rsf2c\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.242399 4795 generic.go:334] "Generic (PLEG): container finished" podID="3e7caf4d-b232-4ef1-bcbb-9e11adf55727" containerID="4cc6b3909b7fae8597719b3798d9b553711ed002f88047133e6c9d36dd5de069" exitCode=0 Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.242459 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e7caf4d-b232-4ef1-bcbb-9e11adf55727","Type":"ContainerDied","Data":"4cc6b3909b7fae8597719b3798d9b553711ed002f88047133e6c9d36dd5de069"} Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.248396 4795 generic.go:334] "Generic (PLEG): container finished" podID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerID="2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631" exitCode=137 Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.248487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67547556b6-45876" event={"ID":"9c0c78d2-7838-4836-975b-87312ba1c49e","Type":"ContainerDied","Data":"2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631"} Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.248526 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67547556b6-45876" event={"ID":"9c0c78d2-7838-4836-975b-87312ba1c49e","Type":"ContainerDied","Data":"5603890d477e9f6b0854b1cfcc7b7d716807379efd2cb6cfe7b92d745a827ab8"} Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.248538 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67547556b6-45876" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.248543 4795 scope.go:117] "RemoveContainer" containerID="d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.266646 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ldnh5"] Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.267614 4795 generic.go:334] "Generic (PLEG): container finished" podID="8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" containerID="ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2" exitCode=0 Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.267710 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.267789 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8","Type":"ContainerDied","Data":"ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2"} Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.267818 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8a9e70da-59e9-47fb-a8cd-89f3577ddcf8","Type":"ContainerDied","Data":"117f2de952018f33e00c22822eb7bad69e5fabe1de2950d37260c33cc292996f"} Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.320176 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.345142 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.353126 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67547556b6-45876"] Mar 10 15:27:46 crc 
kubenswrapper[4795]: I0310 15:27:46.363463 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67547556b6-45876"] Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.369190 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:27:46 crc kubenswrapper[4795]: E0310 15:27:46.369573 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon-log" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.369591 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon-log" Mar 10 15:27:46 crc kubenswrapper[4795]: E0310 15:27:46.369600 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" containerName="glance-log" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.369606 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" containerName="glance-log" Mar 10 15:27:46 crc kubenswrapper[4795]: E0310 15:27:46.369615 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.369622 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon" Mar 10 15:27:46 crc kubenswrapper[4795]: E0310 15:27:46.369639 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" containerName="glance-httpd" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.369645 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" containerName="glance-httpd" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.369818 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" containerName="glance-log" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.369831 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.369840 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" containerName="horizon-log" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.369851 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" containerName="glance-httpd" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.370754 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.377963 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.378453 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.378786 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.511136 4795 scope.go:117] "RemoveContainer" containerID="2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.537250 4795 scope.go:117] "RemoveContainer" containerID="d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0" Mar 10 15:27:46 crc kubenswrapper[4795]: E0310 15:27:46.538211 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0\": container 
with ID starting with d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0 not found: ID does not exist" containerID="d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.538256 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0"} err="failed to get container status \"d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0\": rpc error: code = NotFound desc = could not find container \"d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0\": container with ID starting with d692d77b33ed9db6cf38a7a85cfcaf488f66fbb18654aa049022b7633093c4d0 not found: ID does not exist" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.538292 4795 scope.go:117] "RemoveContainer" containerID="2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631" Mar 10 15:27:46 crc kubenswrapper[4795]: E0310 15:27:46.539226 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631\": container with ID starting with 2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631 not found: ID does not exist" containerID="2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.539267 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631"} err="failed to get container status \"2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631\": rpc error: code = NotFound desc = could not find container \"2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631\": container with ID starting with 2f4f8a942a01467b58d6f901d1434556e72a72738fbad73826456294c4d55631 not 
found: ID does not exist" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.539295 4795 scope.go:117] "RemoveContainer" containerID="ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.556421 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clxfv\" (UniqueName: \"kubernetes.io/projected/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-kube-api-access-clxfv\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.556481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.556527 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.556569 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.556619 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.556655 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.556678 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.556702 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-logs\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.573192 4795 scope.go:117] "RemoveContainer" containerID="f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.592807 4795 scope.go:117] "RemoveContainer" containerID="ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2" Mar 10 15:27:46 crc kubenswrapper[4795]: E0310 15:27:46.593567 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2\": container with ID starting with ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2 not found: ID does not exist" containerID="ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.593617 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2"} err="failed to get container status \"ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2\": rpc error: code = NotFound desc = could not find container \"ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2\": container with ID starting with ae98ddff4462bcc29b0613a52595cf23aa40b0a5d28582bc56dd69d8affa74e2 not found: ID does not exist" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.593638 4795 scope.go:117] "RemoveContainer" containerID="f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058" Mar 10 15:27:46 crc kubenswrapper[4795]: E0310 15:27:46.593826 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058\": container with ID starting with f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058 not found: ID does not exist" containerID="f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.593848 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058"} err="failed to get container status \"f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058\": rpc error: code = NotFound desc = could not find container \"f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058\": container with ID 
starting with f59e3a2a4fd012d239084481d74cd7004a6b0b25b5ab6002fd7f90a701b9c058 not found: ID does not exist" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.658018 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.658122 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.658156 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.658191 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-logs\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.658232 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clxfv\" (UniqueName: \"kubernetes.io/projected/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-kube-api-access-clxfv\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " 
pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.658260 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.658292 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.658326 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.658847 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.659093 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-logs\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.660351 
4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.663493 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.666135 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-scripts\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.667336 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-config-data\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.672417 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.684010 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-clxfv\" (UniqueName: \"kubernetes.io/projected/4f01cb9f-0169-4daa-b31f-0ab1e38d96ce-kube-api-access-clxfv\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.690292 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.693391 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce\") " pod="openstack/glance-default-external-api-0" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.860808 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-logs\") pod \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.860882 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.860946 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-internal-tls-certs\") pod \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.860966 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-config-data\") pod \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.860994 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-scripts\") pod \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.861026 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-combined-ca-bundle\") pod \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.861062 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-httpd-run\") pod \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.861162 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvn66\" (UniqueName: \"kubernetes.io/projected/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-kube-api-access-mvn66\") pod \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\" (UID: \"3e7caf4d-b232-4ef1-bcbb-9e11adf55727\") " Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.861359 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-logs" (OuterVolumeSpecName: "logs") pod "3e7caf4d-b232-4ef1-bcbb-9e11adf55727" (UID: "3e7caf4d-b232-4ef1-bcbb-9e11adf55727"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.861542 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.861970 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3e7caf4d-b232-4ef1-bcbb-9e11adf55727" (UID: "3e7caf4d-b232-4ef1-bcbb-9e11adf55727"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.866636 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "3e7caf4d-b232-4ef1-bcbb-9e11adf55727" (UID: "3e7caf4d-b232-4ef1-bcbb-9e11adf55727"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.869384 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-scripts" (OuterVolumeSpecName: "scripts") pod "3e7caf4d-b232-4ef1-bcbb-9e11adf55727" (UID: "3e7caf4d-b232-4ef1-bcbb-9e11adf55727"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.869693 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-kube-api-access-mvn66" (OuterVolumeSpecName: "kube-api-access-mvn66") pod "3e7caf4d-b232-4ef1-bcbb-9e11adf55727" (UID: "3e7caf4d-b232-4ef1-bcbb-9e11adf55727"). InnerVolumeSpecName "kube-api-access-mvn66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.892216 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e7caf4d-b232-4ef1-bcbb-9e11adf55727" (UID: "3e7caf4d-b232-4ef1-bcbb-9e11adf55727"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.915594 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-config-data" (OuterVolumeSpecName: "config-data") pod "3e7caf4d-b232-4ef1-bcbb-9e11adf55727" (UID: "3e7caf4d-b232-4ef1-bcbb-9e11adf55727"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.916132 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3e7caf4d-b232-4ef1-bcbb-9e11adf55727" (UID: "3e7caf4d-b232-4ef1-bcbb-9e11adf55727"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.963207 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.963240 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.963249 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.963260 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.963270 4795 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.963279 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvn66\" (UniqueName: \"kubernetes.io/projected/3e7caf4d-b232-4ef1-bcbb-9e11adf55727-kube-api-access-mvn66\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.963318 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.981193 4795 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 10 15:27:46 crc kubenswrapper[4795]: I0310 15:27:46.996562 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.064954 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.298338 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ldnh5" event={"ID":"7e6445e9-4e77-48dd-8550-e4068a6d9db2","Type":"ContainerStarted","Data":"5668094d2f63d8445a20c729b2a142af40e31963e44b398c4eab7c3cb68b5e23"} Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.308481 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3e7caf4d-b232-4ef1-bcbb-9e11adf55727","Type":"ContainerDied","Data":"453e204a9b34daf87d6b4dba478697f2414d230aa2b56ac8e6b184deb8066183"} Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.308534 4795 scope.go:117] "RemoveContainer" containerID="4cc6b3909b7fae8597719b3798d9b553711ed002f88047133e6c9d36dd5de069" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.308736 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.342867 4795 scope.go:117] "RemoveContainer" containerID="b73362a15e1523da055b029775569493559757e75675e22b21bdf30ecc6a3fbe" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.355939 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.371769 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.432621 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:27:47 crc kubenswrapper[4795]: E0310 15:27:47.441293 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7caf4d-b232-4ef1-bcbb-9e11adf55727" containerName="glance-httpd" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.441338 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7caf4d-b232-4ef1-bcbb-9e11adf55727" containerName="glance-httpd" Mar 10 15:27:47 crc kubenswrapper[4795]: E0310 15:27:47.441382 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e7caf4d-b232-4ef1-bcbb-9e11adf55727" containerName="glance-log" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.441388 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e7caf4d-b232-4ef1-bcbb-9e11adf55727" containerName="glance-log" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.441709 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7caf4d-b232-4ef1-bcbb-9e11adf55727" containerName="glance-httpd" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.441739 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e7caf4d-b232-4ef1-bcbb-9e11adf55727" containerName="glance-log" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.445636 4795 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.447830 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.448374 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.455360 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.506242 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e7caf4d-b232-4ef1-bcbb-9e11adf55727" path="/var/lib/kubelet/pods/3e7caf4d-b232-4ef1-bcbb-9e11adf55727/volumes" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.507807 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a9e70da-59e9-47fb-a8cd-89f3577ddcf8" path="/var/lib/kubelet/pods/8a9e70da-59e9-47fb-a8cd-89f3577ddcf8/volumes" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.510233 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ebc3860-b7d7-4d35-b77a-3413647b0be4" path="/var/lib/kubelet/pods/8ebc3860-b7d7-4d35-b77a-3413647b0be4/volumes" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.517631 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c0c78d2-7838-4836-975b-87312ba1c49e" path="/var/lib/kubelet/pods/9c0c78d2-7838-4836-975b-87312ba1c49e/volumes" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.574889 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15823ffe-ed18-451a-95e7-30ebee5218f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " 
pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.574962 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.575003 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15823ffe-ed18-451a-95e7-30ebee5218f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.575033 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.575114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.575147 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.575170 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dshzs\" (UniqueName: \"kubernetes.io/projected/15823ffe-ed18-451a-95e7-30ebee5218f3-kube-api-access-dshzs\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.575192 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.575598 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 10 15:27:47 crc kubenswrapper[4795]: W0310 15:27:47.575736 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f01cb9f_0169_4daa_b31f_0ab1e38d96ce.slice/crio-58e3cbbecb53d10fa3e7546f3a271f48cea52f770f7f31695e058a3a09992439 WatchSource:0}: Error finding container 58e3cbbecb53d10fa3e7546f3a271f48cea52f770f7f31695e058a3a09992439: Status 404 returned error can't find the container with id 58e3cbbecb53d10fa3e7546f3a271f48cea52f770f7f31695e058a3a09992439 Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.677024 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15823ffe-ed18-451a-95e7-30ebee5218f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 
crc kubenswrapper[4795]: I0310 15:27:47.677117 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.677153 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/15823ffe-ed18-451a-95e7-30ebee5218f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.677171 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.677199 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.677226 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.677250 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dshzs\" (UniqueName: \"kubernetes.io/projected/15823ffe-ed18-451a-95e7-30ebee5218f3-kube-api-access-dshzs\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.677264 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.677412 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15823ffe-ed18-451a-95e7-30ebee5218f3-logs\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.678106 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.682424 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.682927 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/15823ffe-ed18-451a-95e7-30ebee5218f3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.688524 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.699776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.704798 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/15823ffe-ed18-451a-95e7-30ebee5218f3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.708746 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dshzs\" (UniqueName: \"kubernetes.io/projected/15823ffe-ed18-451a-95e7-30ebee5218f3-kube-api-access-dshzs\") pod \"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:47 crc kubenswrapper[4795]: I0310 15:27:47.778802 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"15823ffe-ed18-451a-95e7-30ebee5218f3\") " pod="openstack/glance-default-internal-api-0" Mar 10 15:27:48 crc kubenswrapper[4795]: I0310 15:27:48.078019 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 10 15:27:48 crc kubenswrapper[4795]: I0310 15:27:48.336312 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce","Type":"ContainerStarted","Data":"992a82de8ab8709c984de603b816e4f5977d6292a22f6009be5574e2550599ce"} Mar 10 15:27:48 crc kubenswrapper[4795]: I0310 15:27:48.336676 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce","Type":"ContainerStarted","Data":"58e3cbbecb53d10fa3e7546f3a271f48cea52f770f7f31695e058a3a09992439"} Mar 10 15:27:48 crc kubenswrapper[4795]: I0310 15:27:48.626572 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 10 15:27:48 crc kubenswrapper[4795]: W0310 15:27:48.642709 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15823ffe_ed18_451a_95e7_30ebee5218f3.slice/crio-e678e742be53ad399f5515d952f8f07ab9f4624862e2793af9c3a97be96d07c5 WatchSource:0}: Error finding container e678e742be53ad399f5515d952f8f07ab9f4624862e2793af9c3a97be96d07c5: Status 404 returned error can't find the container with id e678e742be53ad399f5515d952f8f07ab9f4624862e2793af9c3a97be96d07c5 Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.355546 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.376395 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"4f01cb9f-0169-4daa-b31f-0ab1e38d96ce","Type":"ContainerStarted","Data":"fde50815edb27f21591b872973360d2639675884602a030812d86e54d3e46386"} Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.377747 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15823ffe-ed18-451a-95e7-30ebee5218f3","Type":"ContainerStarted","Data":"ce03208a0ccc636a46b92170b73d47f6f5debda736a7e3b8018f2281fc7ef420"} Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.377768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15823ffe-ed18-451a-95e7-30ebee5218f3","Type":"ContainerStarted","Data":"e678e742be53ad399f5515d952f8f07ab9f4624862e2793af9c3a97be96d07c5"} Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.381117 4795 generic.go:334] "Generic (PLEG): container finished" podID="548bb169-84f7-432a-ba78-c5da18a0966d" containerID="e564d3c4690955d1224e2817c97e7ed9adc93e938bb0a4973b1823bbfc60e183" exitCode=0 Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.381145 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"548bb169-84f7-432a-ba78-c5da18a0966d","Type":"ContainerDied","Data":"e564d3c4690955d1224e2817c97e7ed9adc93e938bb0a4973b1823bbfc60e183"} Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.383825 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56fb779756-g2577" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.402720 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.402701101 podStartE2EDuration="3.402701101s" podCreationTimestamp="2026-03-10 15:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:49.390120151 +0000 UTC 
m=+1302.555861049" watchObservedRunningTime="2026-03-10 15:27:49.402701101 +0000 UTC m=+1302.568441999" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.482027 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.490947 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7946f6d44-rgccw"] Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.491216 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7946f6d44-rgccw" podUID="5d18a18e-e6f3-4c43-a129-36d0f40e3a22" containerName="placement-log" containerID="cri-o://edb515c902b3c5c0d67470c88737b6d3184390a0ff6cbb702c71e344e7dee06b" gracePeriod=30 Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.491242 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-7946f6d44-rgccw" podUID="5d18a18e-e6f3-4c43-a129-36d0f40e3a22" containerName="placement-api" containerID="cri-o://cddd0c411964b371ac93f00d58eba7a6b7a5cfe9371f5cf3bd18b6b29a8e237c" gracePeriod=30 Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.620425 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxhcp\" (UniqueName: \"kubernetes.io/projected/548bb169-84f7-432a-ba78-c5da18a0966d-kube-api-access-xxhcp\") pod \"548bb169-84f7-432a-ba78-c5da18a0966d\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.620508 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-run-httpd\") pod \"548bb169-84f7-432a-ba78-c5da18a0966d\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.620539 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-scripts\") pod \"548bb169-84f7-432a-ba78-c5da18a0966d\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.620608 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-config-data\") pod \"548bb169-84f7-432a-ba78-c5da18a0966d\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.620666 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-sg-core-conf-yaml\") pod \"548bb169-84f7-432a-ba78-c5da18a0966d\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.620696 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-combined-ca-bundle\") pod \"548bb169-84f7-432a-ba78-c5da18a0966d\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.620745 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-log-httpd\") pod \"548bb169-84f7-432a-ba78-c5da18a0966d\" (UID: \"548bb169-84f7-432a-ba78-c5da18a0966d\") " Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.620856 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "548bb169-84f7-432a-ba78-c5da18a0966d" (UID: "548bb169-84f7-432a-ba78-c5da18a0966d"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.621296 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "548bb169-84f7-432a-ba78-c5da18a0966d" (UID: "548bb169-84f7-432a-ba78-c5da18a0966d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.621605 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.621623 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/548bb169-84f7-432a-ba78-c5da18a0966d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.631306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-scripts" (OuterVolumeSpecName: "scripts") pod "548bb169-84f7-432a-ba78-c5da18a0966d" (UID: "548bb169-84f7-432a-ba78-c5da18a0966d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.631762 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/548bb169-84f7-432a-ba78-c5da18a0966d-kube-api-access-xxhcp" (OuterVolumeSpecName: "kube-api-access-xxhcp") pod "548bb169-84f7-432a-ba78-c5da18a0966d" (UID: "548bb169-84f7-432a-ba78-c5da18a0966d"). InnerVolumeSpecName "kube-api-access-xxhcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.664167 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "548bb169-84f7-432a-ba78-c5da18a0966d" (UID: "548bb169-84f7-432a-ba78-c5da18a0966d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.725934 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxhcp\" (UniqueName: \"kubernetes.io/projected/548bb169-84f7-432a-ba78-c5da18a0966d-kube-api-access-xxhcp\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.725974 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.725986 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.744192 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "548bb169-84f7-432a-ba78-c5da18a0966d" (UID: "548bb169-84f7-432a-ba78-c5da18a0966d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.748921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-config-data" (OuterVolumeSpecName: "config-data") pod "548bb169-84f7-432a-ba78-c5da18a0966d" (UID: "548bb169-84f7-432a-ba78-c5da18a0966d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.827578 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:49 crc kubenswrapper[4795]: I0310 15:27:49.827611 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/548bb169-84f7-432a-ba78-c5da18a0966d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.400381 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"548bb169-84f7-432a-ba78-c5da18a0966d","Type":"ContainerDied","Data":"4c876561ce0d295cfdb5f47dacfaa26915f93a73c5f0d0f770888a2c83e5048e"} Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.400733 4795 scope.go:117] "RemoveContainer" containerID="5ca3b7c82ea996343a01c4d58acb11af8be186c5c35b7cfab528d88eb9254317" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.400438 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.408991 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d18a18e-e6f3-4c43-a129-36d0f40e3a22" containerID="edb515c902b3c5c0d67470c88737b6d3184390a0ff6cbb702c71e344e7dee06b" exitCode=143 Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.409081 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7946f6d44-rgccw" event={"ID":"5d18a18e-e6f3-4c43-a129-36d0f40e3a22","Type":"ContainerDied","Data":"edb515c902b3c5c0d67470c88737b6d3184390a0ff6cbb702c71e344e7dee06b"} Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.414524 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"15823ffe-ed18-451a-95e7-30ebee5218f3","Type":"ContainerStarted","Data":"750d9fa078f7bde769a354c387d5d423f853ecac59205545160812b61d9ebf94"} Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.438391 4795 scope.go:117] "RemoveContainer" containerID="8dfaf95a6d854141cbcf95ff530d4ee917d61f933da74a4a5e1c63984fd2ca8e" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.439407 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.446455 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.455036 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:50 crc kubenswrapper[4795]: E0310 15:27:50.455460 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="sg-core" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.455486 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="sg-core" Mar 10 15:27:50 crc kubenswrapper[4795]: E0310 
15:27:50.455516 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="ceilometer-central-agent" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.455525 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="ceilometer-central-agent" Mar 10 15:27:50 crc kubenswrapper[4795]: E0310 15:27:50.455536 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="proxy-httpd" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.455543 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="proxy-httpd" Mar 10 15:27:50 crc kubenswrapper[4795]: E0310 15:27:50.455551 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="ceilometer-notification-agent" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.455557 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="ceilometer-notification-agent" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.455720 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="ceilometer-central-agent" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.455733 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="ceilometer-notification-agent" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.455746 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" containerName="proxy-httpd" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.455756 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" 
containerName="sg-core" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.468029 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.473813 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.474237 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.478817 4795 scope.go:117] "RemoveContainer" containerID="2e9ca0c98eb5dfd3abd40faeedff900feceeba0be6380c6b7e2fa6464ee59359" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.505508 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.5054909690000002 podStartE2EDuration="3.505490969s" podCreationTimestamp="2026-03-10 15:27:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:27:50.467638336 +0000 UTC m=+1303.633379234" watchObservedRunningTime="2026-03-10 15:27:50.505490969 +0000 UTC m=+1303.671231877" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.520797 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.526274 4795 scope.go:117] "RemoveContainer" containerID="e564d3c4690955d1224e2817c97e7ed9adc93e938bb0a4973b1823bbfc60e183" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.540504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-config-data\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc 
kubenswrapper[4795]: I0310 15:27:50.540626 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.540765 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-scripts\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.540845 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.540890 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-log-httpd\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.540988 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-run-httpd\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.541026 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cqg5b\" (UniqueName: \"kubernetes.io/projected/906f0eae-d178-4c9c-bb07-059c2b81a140-kube-api-access-cqg5b\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.642466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-scripts\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.642521 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.642555 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-log-httpd\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.642610 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-run-httpd\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.642631 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqg5b\" (UniqueName: \"kubernetes.io/projected/906f0eae-d178-4c9c-bb07-059c2b81a140-kube-api-access-cqg5b\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 
15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.642707 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-config-data\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.642766 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.643226 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-run-httpd\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.644294 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-log-httpd\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.655388 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-scripts\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.663856 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.664251 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-config-data\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.666483 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.666666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqg5b\" (UniqueName: \"kubernetes.io/projected/906f0eae-d178-4c9c-bb07-059c2b81a140-kube-api-access-cqg5b\") pod \"ceilometer-0\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") " pod="openstack/ceilometer-0" Mar 10 15:27:50 crc kubenswrapper[4795]: I0310 15:27:50.792376 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:27:51 crc kubenswrapper[4795]: I0310 15:27:51.269372 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:51 crc kubenswrapper[4795]: I0310 15:27:51.406846 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:27:51 crc kubenswrapper[4795]: I0310 15:27:51.487993 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="548bb169-84f7-432a-ba78-c5da18a0966d" path="/var/lib/kubelet/pods/548bb169-84f7-432a-ba78-c5da18a0966d/volumes" Mar 10 15:27:53 crc kubenswrapper[4795]: I0310 15:27:53.459623 4795 generic.go:334] "Generic (PLEG): container finished" podID="5d18a18e-e6f3-4c43-a129-36d0f40e3a22" containerID="cddd0c411964b371ac93f00d58eba7a6b7a5cfe9371f5cf3bd18b6b29a8e237c" exitCode=0 Mar 10 15:27:53 crc kubenswrapper[4795]: I0310 15:27:53.459711 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7946f6d44-rgccw" event={"ID":"5d18a18e-e6f3-4c43-a129-36d0f40e3a22","Type":"ContainerDied","Data":"cddd0c411964b371ac93f00d58eba7a6b7a5cfe9371f5cf3bd18b6b29a8e237c"} Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.489240 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906f0eae-d178-4c9c-bb07-059c2b81a140","Type":"ContainerStarted","Data":"b854afb0aee6616265a595abe0b0a268add429e6d601220e514470196c6f5108"} Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.719755 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7946f6d44-rgccw" Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.841296 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsv9j\" (UniqueName: \"kubernetes.io/projected/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-kube-api-access-gsv9j\") pod \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.841377 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-internal-tls-certs\") pod \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.841575 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-public-tls-certs\") pod \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.841618 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-scripts\") pod \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.841668 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-config-data\") pod \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.841742 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-logs\") pod \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.841802 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-combined-ca-bundle\") pod \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\" (UID: \"5d18a18e-e6f3-4c43-a129-36d0f40e3a22\") " Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.842319 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-logs" (OuterVolumeSpecName: "logs") pod "5d18a18e-e6f3-4c43-a129-36d0f40e3a22" (UID: "5d18a18e-e6f3-4c43-a129-36d0f40e3a22"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.842519 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.846552 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-kube-api-access-gsv9j" (OuterVolumeSpecName: "kube-api-access-gsv9j") pod "5d18a18e-e6f3-4c43-a129-36d0f40e3a22" (UID: "5d18a18e-e6f3-4c43-a129-36d0f40e3a22"). InnerVolumeSpecName "kube-api-access-gsv9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.849868 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-scripts" (OuterVolumeSpecName: "scripts") pod "5d18a18e-e6f3-4c43-a129-36d0f40e3a22" (UID: "5d18a18e-e6f3-4c43-a129-36d0f40e3a22"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.886027 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-config-data" (OuterVolumeSpecName: "config-data") pod "5d18a18e-e6f3-4c43-a129-36d0f40e3a22" (UID: "5d18a18e-e6f3-4c43-a129-36d0f40e3a22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.906281 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d18a18e-e6f3-4c43-a129-36d0f40e3a22" (UID: "5d18a18e-e6f3-4c43-a129-36d0f40e3a22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.944705 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsv9j\" (UniqueName: \"kubernetes.io/projected/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-kube-api-access-gsv9j\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.944736 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.944745 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.944784 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.950441 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d18a18e-e6f3-4c43-a129-36d0f40e3a22" (UID: "5d18a18e-e6f3-4c43-a129-36d0f40e3a22"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:55 crc kubenswrapper[4795]: I0310 15:27:55.958482 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d18a18e-e6f3-4c43-a129-36d0f40e3a22" (UID: "5d18a18e-e6f3-4c43-a129-36d0f40e3a22"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.046523 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.046852 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d18a18e-e6f3-4c43-a129-36d0f40e3a22-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.504649 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7946f6d44-rgccw"
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.504674 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7946f6d44-rgccw" event={"ID":"5d18a18e-e6f3-4c43-a129-36d0f40e3a22","Type":"ContainerDied","Data":"fa5a910646ab02725b14f67fb39ac155a34b09bc5d2bcbba72eb3eab9ee09a52"}
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.504764 4795 scope.go:117] "RemoveContainer" containerID="cddd0c411964b371ac93f00d58eba7a6b7a5cfe9371f5cf3bd18b6b29a8e237c"
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.516221 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ldnh5" event={"ID":"7e6445e9-4e77-48dd-8550-e4068a6d9db2","Type":"ContainerStarted","Data":"eb88599d2817a644775a8873925b1e9ee2221e005905f5e309974e191e26b13c"}
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.521614 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906f0eae-d178-4c9c-bb07-059c2b81a140","Type":"ContainerStarted","Data":"94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf"}
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.543036 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-ldnh5" podStartSLOduration=2.560699923 podStartE2EDuration="11.543015517s" podCreationTimestamp="2026-03-10 15:27:45 +0000 UTC" firstStartedPulling="2026-03-10 15:27:46.511302113 +0000 UTC m=+1299.677043001" lastFinishedPulling="2026-03-10 15:27:55.493617697 +0000 UTC m=+1308.659358595" observedRunningTime="2026-03-10 15:27:56.542285426 +0000 UTC m=+1309.708026324" watchObservedRunningTime="2026-03-10 15:27:56.543015517 +0000 UTC m=+1309.708756415"
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.599405 4795 scope.go:117] "RemoveContainer" containerID="edb515c902b3c5c0d67470c88737b6d3184390a0ff6cbb702c71e344e7dee06b"
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.633694 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-7946f6d44-rgccw"]
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.640885 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-7946f6d44-rgccw"]
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.997203 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 10 15:27:56 crc kubenswrapper[4795]: I0310 15:27:56.998806 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 10 15:27:57 crc kubenswrapper[4795]: I0310 15:27:57.027159 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 10 15:27:57 crc kubenswrapper[4795]: I0310 15:27:57.039492 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 10 15:27:57 crc kubenswrapper[4795]: I0310 15:27:57.498257 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d18a18e-e6f3-4c43-a129-36d0f40e3a22" path="/var/lib/kubelet/pods/5d18a18e-e6f3-4c43-a129-36d0f40e3a22/volumes"
Mar 10 15:27:57 crc kubenswrapper[4795]: I0310 15:27:57.532683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906f0eae-d178-4c9c-bb07-059c2b81a140","Type":"ContainerStarted","Data":"82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b"}
Mar 10 15:27:57 crc kubenswrapper[4795]: I0310 15:27:57.534127 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 15:27:57 crc kubenswrapper[4795]: I0310 15:27:57.534168 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 10 15:27:58 crc kubenswrapper[4795]: I0310 15:27:58.078366 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 10 15:27:58 crc kubenswrapper[4795]: I0310 15:27:58.078799 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 10 15:27:58 crc kubenswrapper[4795]: I0310 15:27:58.147162 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 10 15:27:58 crc kubenswrapper[4795]: I0310 15:27:58.164976 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 10 15:27:58 crc kubenswrapper[4795]: I0310 15:27:58.546340 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906f0eae-d178-4c9c-bb07-059c2b81a140","Type":"ContainerStarted","Data":"9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d"}
Mar 10 15:27:58 crc kubenswrapper[4795]: I0310 15:27:58.547207 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 10 15:27:58 crc kubenswrapper[4795]: I0310 15:27:58.547225 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 10 15:27:59 crc kubenswrapper[4795]: I0310 15:27:59.557471 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 15:27:59 crc kubenswrapper[4795]: I0310 15:27:59.558181 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 15:27:59 crc kubenswrapper[4795]: I0310 15:27:59.607737 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 15:27:59 crc kubenswrapper[4795]: I0310 15:27:59.608344 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.127960 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552608-vwfhn"]
Mar 10 15:28:00 crc kubenswrapper[4795]: E0310 15:28:00.128558 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d18a18e-e6f3-4c43-a129-36d0f40e3a22" containerName="placement-log"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.128577 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d18a18e-e6f3-4c43-a129-36d0f40e3a22" containerName="placement-log"
Mar 10 15:28:00 crc kubenswrapper[4795]: E0310 15:28:00.128610 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d18a18e-e6f3-4c43-a129-36d0f40e3a22" containerName="placement-api"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.128616 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d18a18e-e6f3-4c43-a129-36d0f40e3a22" containerName="placement-api"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.128764 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d18a18e-e6f3-4c43-a129-36d0f40e3a22" containerName="placement-api"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.128792 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d18a18e-e6f3-4c43-a129-36d0f40e3a22" containerName="placement-log"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.129437 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552608-vwfhn"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.131515 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.131522 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.131533 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.143039 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552608-vwfhn"]
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.244778 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrpv7\" (UniqueName: \"kubernetes.io/projected/0c0d7784-e8d2-4b84-9332-185029fb41aa-kube-api-access-zrpv7\") pod \"auto-csr-approver-29552608-vwfhn\" (UID: \"0c0d7784-e8d2-4b84-9332-185029fb41aa\") " pod="openshift-infra/auto-csr-approver-29552608-vwfhn"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.346929 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrpv7\" (UniqueName: \"kubernetes.io/projected/0c0d7784-e8d2-4b84-9332-185029fb41aa-kube-api-access-zrpv7\") pod \"auto-csr-approver-29552608-vwfhn\" (UID: \"0c0d7784-e8d2-4b84-9332-185029fb41aa\") " pod="openshift-infra/auto-csr-approver-29552608-vwfhn"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.366031 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrpv7\" (UniqueName: \"kubernetes.io/projected/0c0d7784-e8d2-4b84-9332-185029fb41aa-kube-api-access-zrpv7\") pod \"auto-csr-approver-29552608-vwfhn\" (UID: \"0c0d7784-e8d2-4b84-9332-185029fb41aa\") " pod="openshift-infra/auto-csr-approver-29552608-vwfhn"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.496420 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552608-vwfhn"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.569878 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906f0eae-d178-4c9c-bb07-059c2b81a140","Type":"ContainerStarted","Data":"cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066"}
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.569972 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.569993 4795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.570080 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="ceilometer-central-agent" containerID="cri-o://94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf" gracePeriod=30
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.570116 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="sg-core" containerID="cri-o://9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d" gracePeriod=30
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.570125 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="proxy-httpd" containerID="cri-o://cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066" gracePeriod=30
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.570112 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.570148 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="ceilometer-notification-agent" containerID="cri-o://82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b" gracePeriod=30
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.597946 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.028491997 podStartE2EDuration="10.597926521s" podCreationTimestamp="2026-03-10 15:27:50 +0000 UTC" firstStartedPulling="2026-03-10 15:27:55.404247018 +0000 UTC m=+1308.569987916" lastFinishedPulling="2026-03-10 15:27:59.973681542 +0000 UTC m=+1313.139422440" observedRunningTime="2026-03-10 15:28:00.593824224 +0000 UTC m=+1313.759565122" watchObservedRunningTime="2026-03-10 15:28:00.597926521 +0000 UTC m=+1313.763667419"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.619475 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 10 15:28:00 crc kubenswrapper[4795]: I0310 15:28:00.670603 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Mar 10 15:28:01 crc kubenswrapper[4795]: I0310 15:28:01.004345 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552608-vwfhn"]
Mar 10 15:28:01 crc kubenswrapper[4795]: I0310 15:28:01.581147 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552608-vwfhn" event={"ID":"0c0d7784-e8d2-4b84-9332-185029fb41aa","Type":"ContainerStarted","Data":"ffd8c59b0ff3c04d875898e0082540d63d2306a8e114f290cea54085d65f9e6e"}
Mar 10 15:28:01 crc kubenswrapper[4795]: I0310 15:28:01.583057 4795 generic.go:334] "Generic (PLEG): container finished" podID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerID="cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066" exitCode=0
Mar 10 15:28:01 crc kubenswrapper[4795]: I0310 15:28:01.583266 4795 generic.go:334] "Generic (PLEG): container finished" podID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerID="9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d" exitCode=2
Mar 10 15:28:01 crc kubenswrapper[4795]: I0310 15:28:01.583276 4795 generic.go:334] "Generic (PLEG): container finished" podID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerID="82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b" exitCode=0
Mar 10 15:28:01 crc kubenswrapper[4795]: I0310 15:28:01.583995 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906f0eae-d178-4c9c-bb07-059c2b81a140","Type":"ContainerDied","Data":"cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066"}
Mar 10 15:28:01 crc kubenswrapper[4795]: I0310 15:28:01.584031 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906f0eae-d178-4c9c-bb07-059c2b81a140","Type":"ContainerDied","Data":"9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d"}
Mar 10 15:28:01 crc kubenswrapper[4795]: I0310 15:28:01.584045 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906f0eae-d178-4c9c-bb07-059c2b81a140","Type":"ContainerDied","Data":"82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b"}
Mar 10 15:28:02 crc kubenswrapper[4795]: I0310 15:28:02.595126 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552608-vwfhn" event={"ID":"0c0d7784-e8d2-4b84-9332-185029fb41aa","Type":"ContainerStarted","Data":"dd4a52e971bad8551d909acd49617832f3d17b518ec47eb1d3ef2ee1788128c5"}
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.518856 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.547646 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552608-vwfhn" podStartSLOduration=2.632491411 podStartE2EDuration="3.547622788s" podCreationTimestamp="2026-03-10 15:28:00 +0000 UTC" firstStartedPulling="2026-03-10 15:28:01.013909509 +0000 UTC m=+1314.179650407" lastFinishedPulling="2026-03-10 15:28:01.929040886 +0000 UTC m=+1315.094781784" observedRunningTime="2026-03-10 15:28:02.611781699 +0000 UTC m=+1315.777522597" watchObservedRunningTime="2026-03-10 15:28:03.547622788 +0000 UTC m=+1316.713363686"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.604203 4795 generic.go:334] "Generic (PLEG): container finished" podID="0c0d7784-e8d2-4b84-9332-185029fb41aa" containerID="dd4a52e971bad8551d909acd49617832f3d17b518ec47eb1d3ef2ee1788128c5" exitCode=0
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.604281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552608-vwfhn" event={"ID":"0c0d7784-e8d2-4b84-9332-185029fb41aa","Type":"ContainerDied","Data":"dd4a52e971bad8551d909acd49617832f3d17b518ec47eb1d3ef2ee1788128c5"}
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.607326 4795 generic.go:334] "Generic (PLEG): container finished" podID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerID="94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf" exitCode=0
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.607369 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906f0eae-d178-4c9c-bb07-059c2b81a140","Type":"ContainerDied","Data":"94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf"}
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.607392 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906f0eae-d178-4c9c-bb07-059c2b81a140","Type":"ContainerDied","Data":"b854afb0aee6616265a595abe0b0a268add429e6d601220e514470196c6f5108"}
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.607402 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.607409 4795 scope.go:117] "RemoveContainer" containerID="cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.619932 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-run-httpd\") pod \"906f0eae-d178-4c9c-bb07-059c2b81a140\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") "
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.620036 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqg5b\" (UniqueName: \"kubernetes.io/projected/906f0eae-d178-4c9c-bb07-059c2b81a140-kube-api-access-cqg5b\") pod \"906f0eae-d178-4c9c-bb07-059c2b81a140\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") "
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.620163 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-log-httpd\") pod \"906f0eae-d178-4c9c-bb07-059c2b81a140\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") "
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.620210 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-scripts\") pod \"906f0eae-d178-4c9c-bb07-059c2b81a140\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") "
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.620271 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-config-data\") pod \"906f0eae-d178-4c9c-bb07-059c2b81a140\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") "
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.620360 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-combined-ca-bundle\") pod \"906f0eae-d178-4c9c-bb07-059c2b81a140\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") "
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.620397 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-sg-core-conf-yaml\") pod \"906f0eae-d178-4c9c-bb07-059c2b81a140\" (UID: \"906f0eae-d178-4c9c-bb07-059c2b81a140\") "
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.627921 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "906f0eae-d178-4c9c-bb07-059c2b81a140" (UID: "906f0eae-d178-4c9c-bb07-059c2b81a140"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.628031 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "906f0eae-d178-4c9c-bb07-059c2b81a140" (UID: "906f0eae-d178-4c9c-bb07-059c2b81a140"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.629509 4795 scope.go:117] "RemoveContainer" containerID="9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.631284 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906f0eae-d178-4c9c-bb07-059c2b81a140-kube-api-access-cqg5b" (OuterVolumeSpecName: "kube-api-access-cqg5b") pod "906f0eae-d178-4c9c-bb07-059c2b81a140" (UID: "906f0eae-d178-4c9c-bb07-059c2b81a140"). InnerVolumeSpecName "kube-api-access-cqg5b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.640158 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-scripts" (OuterVolumeSpecName: "scripts") pod "906f0eae-d178-4c9c-bb07-059c2b81a140" (UID: "906f0eae-d178-4c9c-bb07-059c2b81a140"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.703338 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "906f0eae-d178-4c9c-bb07-059c2b81a140" (UID: "906f0eae-d178-4c9c-bb07-059c2b81a140"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.722190 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.722218 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqg5b\" (UniqueName: \"kubernetes.io/projected/906f0eae-d178-4c9c-bb07-059c2b81a140-kube-api-access-cqg5b\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.722231 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906f0eae-d178-4c9c-bb07-059c2b81a140-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.722242 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.722252 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.728163 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "906f0eae-d178-4c9c-bb07-059c2b81a140" (UID: "906f0eae-d178-4c9c-bb07-059c2b81a140"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.739139 4795 scope.go:117] "RemoveContainer" containerID="82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.765156 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-config-data" (OuterVolumeSpecName: "config-data") pod "906f0eae-d178-4c9c-bb07-059c2b81a140" (UID: "906f0eae-d178-4c9c-bb07-059c2b81a140"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.772794 4795 scope.go:117] "RemoveContainer" containerID="94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.790450 4795 scope.go:117] "RemoveContainer" containerID="cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066"
Mar 10 15:28:03 crc kubenswrapper[4795]: E0310 15:28:03.791058 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066\": container with ID starting with cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066 not found: ID does not exist" containerID="cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.791107 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066"} err="failed to get container status \"cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066\": rpc error: code = NotFound desc = could not find container \"cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066\": container with ID starting with cc40483d8fab1c2f3de1d55e2c4f0f12fce0f6623091ef2c8a7948ff20d75066 not found: ID does not exist"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.791136 4795 scope.go:117] "RemoveContainer" containerID="9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d"
Mar 10 15:28:03 crc kubenswrapper[4795]: E0310 15:28:03.791461 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d\": container with ID starting with 9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d not found: ID does not exist" containerID="9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.791481 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d"} err="failed to get container status \"9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d\": rpc error: code = NotFound desc = could not find container \"9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d\": container with ID starting with 9c13f9b0b82b0e9e778c1668bcf8f03968eb8ebb4434327be50f3515488b4b7d not found: ID does not exist"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.791511 4795 scope.go:117] "RemoveContainer" containerID="82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b"
Mar 10 15:28:03 crc kubenswrapper[4795]: E0310 15:28:03.791795 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b\": container with ID starting with 82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b not found: ID does not exist" containerID="82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.791836 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b"} err="failed to get container status \"82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b\": rpc error: code = NotFound desc = could not find container \"82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b\": container with ID starting with 82e80da7058e512856ec90378293238be3ed925da4ef06425fb236ebe636a32b not found: ID does not exist"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.791865 4795 scope.go:117] "RemoveContainer" containerID="94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf"
Mar 10 15:28:03 crc kubenswrapper[4795]: E0310 15:28:03.792392 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf\": container with ID starting with 94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf not found: ID does not exist" containerID="94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.792432 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf"} err="failed to get container status \"94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf\": rpc error: code = NotFound desc = could not find container \"94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf\": container with ID starting with 94756d0c69c1a96a7cf0cddd8fe6d8e825c495bfbc79a0f4276c8b32a6688bdf not found: ID does not exist"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.824310 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.824365 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906f0eae-d178-4c9c-bb07-059c2b81a140-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.936387 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.944398 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.964931 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:28:03 crc kubenswrapper[4795]: E0310 15:28:03.965286 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="ceilometer-central-agent"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.965302 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="ceilometer-central-agent"
Mar 10 15:28:03 crc kubenswrapper[4795]: E0310 15:28:03.965321 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="sg-core"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.965328 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="sg-core"
Mar 10 15:28:03 crc kubenswrapper[4795]: E0310 15:28:03.965340 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="ceilometer-notification-agent"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.965346 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="ceilometer-notification-agent"
Mar 10 15:28:03 crc kubenswrapper[4795]: E0310 15:28:03.965359 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="proxy-httpd"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.965365 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="proxy-httpd"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.965554 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="ceilometer-central-agent"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.965569 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="ceilometer-notification-agent"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.965577 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="sg-core"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.965590 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" containerName="proxy-httpd"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.967129 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.969338 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.971179 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 10 15:28:03 crc kubenswrapper[4795]: I0310 15:28:03.979927 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.134216 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nglq\" (UniqueName: \"kubernetes.io/projected/a3d1443a-9a20-413c-9845-6b722f3070b1-kube-api-access-9nglq\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0"
Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.134296 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0"
Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.134330 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0"
Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.134354 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-log-httpd\") pod \"ceilometer-0\" (UID:
\"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.134390 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-run-httpd\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.135643 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-scripts\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.135791 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-config-data\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.236663 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-config-data\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.236719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nglq\" (UniqueName: \"kubernetes.io/projected/a3d1443a-9a20-413c-9845-6b722f3070b1-kube-api-access-9nglq\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.236759 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.236780 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.236800 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-log-httpd\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.236824 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-run-httpd\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.237196 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-log-httpd\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.237231 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-run-httpd\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" 
Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.237248 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-scripts\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.241610 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.241711 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-config-data\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.241790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.243238 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-scripts\") pod \"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.262973 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nglq\" (UniqueName: \"kubernetes.io/projected/a3d1443a-9a20-413c-9845-6b722f3070b1-kube-api-access-9nglq\") pod 
\"ceilometer-0\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.280783 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.717761 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:04 crc kubenswrapper[4795]: W0310 15:28:04.741645 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3d1443a_9a20_413c_9845_6b722f3070b1.slice/crio-8161e45d16ff951f97ff0affc5b1557e784551e866884dccefe4e9eb2bdba2f1 WatchSource:0}: Error finding container 8161e45d16ff951f97ff0affc5b1557e784551e866884dccefe4e9eb2bdba2f1: Status 404 returned error can't find the container with id 8161e45d16ff951f97ff0affc5b1557e784551e866884dccefe4e9eb2bdba2f1 Mar 10 15:28:04 crc kubenswrapper[4795]: I0310 15:28:04.871046 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552608-vwfhn" Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.052029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrpv7\" (UniqueName: \"kubernetes.io/projected/0c0d7784-e8d2-4b84-9332-185029fb41aa-kube-api-access-zrpv7\") pod \"0c0d7784-e8d2-4b84-9332-185029fb41aa\" (UID: \"0c0d7784-e8d2-4b84-9332-185029fb41aa\") " Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.058237 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0d7784-e8d2-4b84-9332-185029fb41aa-kube-api-access-zrpv7" (OuterVolumeSpecName: "kube-api-access-zrpv7") pod "0c0d7784-e8d2-4b84-9332-185029fb41aa" (UID: "0c0d7784-e8d2-4b84-9332-185029fb41aa"). InnerVolumeSpecName "kube-api-access-zrpv7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.154850 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrpv7\" (UniqueName: \"kubernetes.io/projected/0c0d7784-e8d2-4b84-9332-185029fb41aa-kube-api-access-zrpv7\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.379256 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.486204 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906f0eae-d178-4c9c-bb07-059c2b81a140" path="/var/lib/kubelet/pods/906f0eae-d178-4c9c-bb07-059c2b81a140/volumes" Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.633896 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552608-vwfhn" Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.633900 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552608-vwfhn" event={"ID":"0c0d7784-e8d2-4b84-9332-185029fb41aa","Type":"ContainerDied","Data":"ffd8c59b0ff3c04d875898e0082540d63d2306a8e114f290cea54085d65f9e6e"} Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.633949 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffd8c59b0ff3c04d875898e0082540d63d2306a8e114f290cea54085d65f9e6e" Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.635421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3d1443a-9a20-413c-9845-6b722f3070b1","Type":"ContainerStarted","Data":"bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765"} Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.635462 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a3d1443a-9a20-413c-9845-6b722f3070b1","Type":"ContainerStarted","Data":"8161e45d16ff951f97ff0affc5b1557e784551e866884dccefe4e9eb2bdba2f1"} Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.671520 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552602-h76ns"] Mar 10 15:28:05 crc kubenswrapper[4795]: I0310 15:28:05.679644 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552602-h76ns"] Mar 10 15:28:06 crc kubenswrapper[4795]: I0310 15:28:06.648206 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e6445e9-4e77-48dd-8550-e4068a6d9db2" containerID="eb88599d2817a644775a8873925b1e9ee2221e005905f5e309974e191e26b13c" exitCode=0 Mar 10 15:28:06 crc kubenswrapper[4795]: I0310 15:28:06.648284 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ldnh5" event={"ID":"7e6445e9-4e77-48dd-8550-e4068a6d9db2","Type":"ContainerDied","Data":"eb88599d2817a644775a8873925b1e9ee2221e005905f5e309974e191e26b13c"} Mar 10 15:28:06 crc kubenswrapper[4795]: I0310 15:28:06.650956 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3d1443a-9a20-413c-9845-6b722f3070b1","Type":"ContainerStarted","Data":"fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398"} Mar 10 15:28:07 crc kubenswrapper[4795]: I0310 15:28:07.496182 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe4cc933-6bcb-4283-bca6-8645ac162270" path="/var/lib/kubelet/pods/fe4cc933-6bcb-4283-bca6-8645ac162270/volumes" Mar 10 15:28:07 crc kubenswrapper[4795]: I0310 15:28:07.662870 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3d1443a-9a20-413c-9845-6b722f3070b1","Type":"ContainerStarted","Data":"99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256"} Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.004490 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.109990 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-combined-ca-bundle\") pod \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.110038 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-config-data\") pod \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.110082 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-scripts\") pod \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.110164 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksmwd\" (UniqueName: \"kubernetes.io/projected/7e6445e9-4e77-48dd-8550-e4068a6d9db2-kube-api-access-ksmwd\") pod \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\" (UID: \"7e6445e9-4e77-48dd-8550-e4068a6d9db2\") " Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.115698 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e6445e9-4e77-48dd-8550-e4068a6d9db2-kube-api-access-ksmwd" (OuterVolumeSpecName: "kube-api-access-ksmwd") pod "7e6445e9-4e77-48dd-8550-e4068a6d9db2" (UID: "7e6445e9-4e77-48dd-8550-e4068a6d9db2"). InnerVolumeSpecName "kube-api-access-ksmwd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.117306 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-scripts" (OuterVolumeSpecName: "scripts") pod "7e6445e9-4e77-48dd-8550-e4068a6d9db2" (UID: "7e6445e9-4e77-48dd-8550-e4068a6d9db2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.143407 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-config-data" (OuterVolumeSpecName: "config-data") pod "7e6445e9-4e77-48dd-8550-e4068a6d9db2" (UID: "7e6445e9-4e77-48dd-8550-e4068a6d9db2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.152455 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e6445e9-4e77-48dd-8550-e4068a6d9db2" (UID: "7e6445e9-4e77-48dd-8550-e4068a6d9db2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.211807 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.211840 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.211851 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e6445e9-4e77-48dd-8550-e4068a6d9db2-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.211860 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksmwd\" (UniqueName: \"kubernetes.io/projected/7e6445e9-4e77-48dd-8550-e4068a6d9db2-kube-api-access-ksmwd\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.675768 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-ldnh5" event={"ID":"7e6445e9-4e77-48dd-8550-e4068a6d9db2","Type":"ContainerDied","Data":"5668094d2f63d8445a20c729b2a142af40e31963e44b398c4eab7c3cb68b5e23"} Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.675817 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5668094d2f63d8445a20c729b2a142af40e31963e44b398c4eab7c3cb68b5e23" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.675924 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-ldnh5" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.776881 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 15:28:08 crc kubenswrapper[4795]: E0310 15:28:08.777777 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0d7784-e8d2-4b84-9332-185029fb41aa" containerName="oc" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.777807 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0d7784-e8d2-4b84-9332-185029fb41aa" containerName="oc" Mar 10 15:28:08 crc kubenswrapper[4795]: E0310 15:28:08.777876 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e6445e9-4e77-48dd-8550-e4068a6d9db2" containerName="nova-cell0-conductor-db-sync" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.777894 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e6445e9-4e77-48dd-8550-e4068a6d9db2" containerName="nova-cell0-conductor-db-sync" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.778282 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e6445e9-4e77-48dd-8550-e4068a6d9db2" containerName="nova-cell0-conductor-db-sync" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.778318 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0d7784-e8d2-4b84-9332-185029fb41aa" containerName="oc" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.779400 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.782341 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-hcmrb" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.782603 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.800519 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.924755 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3097e989-78f5-453d-a495-fc8b4a805efb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3097e989-78f5-453d-a495-fc8b4a805efb\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.924817 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlztp\" (UniqueName: \"kubernetes.io/projected/3097e989-78f5-453d-a495-fc8b4a805efb-kube-api-access-wlztp\") pod \"nova-cell0-conductor-0\" (UID: \"3097e989-78f5-453d-a495-fc8b4a805efb\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:08 crc kubenswrapper[4795]: I0310 15:28:08.925036 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3097e989-78f5-453d-a495-fc8b4a805efb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3097e989-78f5-453d-a495-fc8b4a805efb\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:09 crc kubenswrapper[4795]: I0310 15:28:09.026499 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3097e989-78f5-453d-a495-fc8b4a805efb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3097e989-78f5-453d-a495-fc8b4a805efb\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:09 crc kubenswrapper[4795]: I0310 15:28:09.026573 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlztp\" (UniqueName: \"kubernetes.io/projected/3097e989-78f5-453d-a495-fc8b4a805efb-kube-api-access-wlztp\") pod \"nova-cell0-conductor-0\" (UID: \"3097e989-78f5-453d-a495-fc8b4a805efb\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:09 crc kubenswrapper[4795]: I0310 15:28:09.026672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3097e989-78f5-453d-a495-fc8b4a805efb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3097e989-78f5-453d-a495-fc8b4a805efb\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:09 crc kubenswrapper[4795]: I0310 15:28:09.033112 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3097e989-78f5-453d-a495-fc8b4a805efb-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3097e989-78f5-453d-a495-fc8b4a805efb\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:09 crc kubenswrapper[4795]: I0310 15:28:09.041950 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3097e989-78f5-453d-a495-fc8b4a805efb-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3097e989-78f5-453d-a495-fc8b4a805efb\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:09 crc kubenswrapper[4795]: I0310 15:28:09.045898 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlztp\" (UniqueName: \"kubernetes.io/projected/3097e989-78f5-453d-a495-fc8b4a805efb-kube-api-access-wlztp\") pod \"nova-cell0-conductor-0\" (UID: 
\"3097e989-78f5-453d-a495-fc8b4a805efb\") " pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:09 crc kubenswrapper[4795]: I0310 15:28:09.100712 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:09 crc kubenswrapper[4795]: I0310 15:28:09.580619 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 10 15:28:09 crc kubenswrapper[4795]: I0310 15:28:09.689174 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3097e989-78f5-453d-a495-fc8b4a805efb","Type":"ContainerStarted","Data":"c9de5fc570965c998aca8a08e1b8724cc23cac74b91a12692d8343b87eb890d2"} Mar 10 15:28:10 crc kubenswrapper[4795]: I0310 15:28:10.703615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3097e989-78f5-453d-a495-fc8b4a805efb","Type":"ContainerStarted","Data":"68d47d3682fb26ff70c4c0c2bc89b0966faaab169c5f7c8ae95dfb867efb5f2b"} Mar 10 15:28:10 crc kubenswrapper[4795]: I0310 15:28:10.703956 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:10 crc kubenswrapper[4795]: I0310 15:28:10.708612 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3d1443a-9a20-413c-9845-6b722f3070b1","Type":"ContainerStarted","Data":"a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c"} Mar 10 15:28:10 crc kubenswrapper[4795]: I0310 15:28:10.708843 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:28:10 crc kubenswrapper[4795]: I0310 15:28:10.708896 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="ceilometer-notification-agent" 
containerID="cri-o://fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398" gracePeriod=30 Mar 10 15:28:10 crc kubenswrapper[4795]: I0310 15:28:10.708920 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="ceilometer-central-agent" containerID="cri-o://bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765" gracePeriod=30 Mar 10 15:28:10 crc kubenswrapper[4795]: I0310 15:28:10.708899 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="sg-core" containerID="cri-o://99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256" gracePeriod=30 Mar 10 15:28:10 crc kubenswrapper[4795]: I0310 15:28:10.708890 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="proxy-httpd" containerID="cri-o://a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c" gracePeriod=30 Mar 10 15:28:10 crc kubenswrapper[4795]: I0310 15:28:10.751292 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.751265337 podStartE2EDuration="2.751265337s" podCreationTimestamp="2026-03-10 15:28:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:10.727894388 +0000 UTC m=+1323.893635296" watchObservedRunningTime="2026-03-10 15:28:10.751265337 +0000 UTC m=+1323.917006255" Mar 10 15:28:10 crc kubenswrapper[4795]: I0310 15:28:10.754417 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.918502027 podStartE2EDuration="7.754406337s" podCreationTimestamp="2026-03-10 15:28:03 +0000 UTC" 
firstStartedPulling="2026-03-10 15:28:04.743692587 +0000 UTC m=+1317.909433485" lastFinishedPulling="2026-03-10 15:28:09.579596887 +0000 UTC m=+1322.745337795" observedRunningTime="2026-03-10 15:28:10.753913853 +0000 UTC m=+1323.919654751" watchObservedRunningTime="2026-03-10 15:28:10.754406337 +0000 UTC m=+1323.920147245" Mar 10 15:28:11 crc kubenswrapper[4795]: I0310 15:28:11.719948 4795 generic.go:334] "Generic (PLEG): container finished" podID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerID="a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c" exitCode=0 Mar 10 15:28:11 crc kubenswrapper[4795]: I0310 15:28:11.719991 4795 generic.go:334] "Generic (PLEG): container finished" podID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerID="99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256" exitCode=2 Mar 10 15:28:11 crc kubenswrapper[4795]: I0310 15:28:11.720002 4795 generic.go:334] "Generic (PLEG): container finished" podID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerID="fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398" exitCode=0 Mar 10 15:28:11 crc kubenswrapper[4795]: I0310 15:28:11.720021 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3d1443a-9a20-413c-9845-6b722f3070b1","Type":"ContainerDied","Data":"a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c"} Mar 10 15:28:11 crc kubenswrapper[4795]: I0310 15:28:11.720094 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3d1443a-9a20-413c-9845-6b722f3070b1","Type":"ContainerDied","Data":"99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256"} Mar 10 15:28:11 crc kubenswrapper[4795]: I0310 15:28:11.720114 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3d1443a-9a20-413c-9845-6b722f3070b1","Type":"ContainerDied","Data":"fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398"} Mar 10 15:28:12 
crc kubenswrapper[4795]: I0310 15:28:12.222536 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.394940 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-log-httpd\") pod \"a3d1443a-9a20-413c-9845-6b722f3070b1\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.395280 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-combined-ca-bundle\") pod \"a3d1443a-9a20-413c-9845-6b722f3070b1\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.395445 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-run-httpd\") pod \"a3d1443a-9a20-413c-9845-6b722f3070b1\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.395499 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-config-data\") pod \"a3d1443a-9a20-413c-9845-6b722f3070b1\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.395611 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nglq\" (UniqueName: \"kubernetes.io/projected/a3d1443a-9a20-413c-9845-6b722f3070b1-kube-api-access-9nglq\") pod \"a3d1443a-9a20-413c-9845-6b722f3070b1\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.395656 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-scripts\") pod \"a3d1443a-9a20-413c-9845-6b722f3070b1\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.395715 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-sg-core-conf-yaml\") pod \"a3d1443a-9a20-413c-9845-6b722f3070b1\" (UID: \"a3d1443a-9a20-413c-9845-6b722f3070b1\") " Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.396037 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a3d1443a-9a20-413c-9845-6b722f3070b1" (UID: "a3d1443a-9a20-413c-9845-6b722f3070b1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.396244 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a3d1443a-9a20-413c-9845-6b722f3070b1" (UID: "a3d1443a-9a20-413c-9845-6b722f3070b1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.396945 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.396976 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a3d1443a-9a20-413c-9845-6b722f3070b1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.400750 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-scripts" (OuterVolumeSpecName: "scripts") pod "a3d1443a-9a20-413c-9845-6b722f3070b1" (UID: "a3d1443a-9a20-413c-9845-6b722f3070b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.407362 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3d1443a-9a20-413c-9845-6b722f3070b1-kube-api-access-9nglq" (OuterVolumeSpecName: "kube-api-access-9nglq") pod "a3d1443a-9a20-413c-9845-6b722f3070b1" (UID: "a3d1443a-9a20-413c-9845-6b722f3070b1"). InnerVolumeSpecName "kube-api-access-9nglq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.448474 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a3d1443a-9a20-413c-9845-6b722f3070b1" (UID: "a3d1443a-9a20-413c-9845-6b722f3070b1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.489092 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a3d1443a-9a20-413c-9845-6b722f3070b1" (UID: "a3d1443a-9a20-413c-9845-6b722f3070b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.499104 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nglq\" (UniqueName: \"kubernetes.io/projected/a3d1443a-9a20-413c-9845-6b722f3070b1-kube-api-access-9nglq\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.499171 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.499191 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.499210 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.535884 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-config-data" (OuterVolumeSpecName: "config-data") pod "a3d1443a-9a20-413c-9845-6b722f3070b1" (UID: "a3d1443a-9a20-413c-9845-6b722f3070b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.601478 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d1443a-9a20-413c-9845-6b722f3070b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.733186 4795 generic.go:334] "Generic (PLEG): container finished" podID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerID="bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765" exitCode=0 Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.733234 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3d1443a-9a20-413c-9845-6b722f3070b1","Type":"ContainerDied","Data":"bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765"} Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.733270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a3d1443a-9a20-413c-9845-6b722f3070b1","Type":"ContainerDied","Data":"8161e45d16ff951f97ff0affc5b1557e784551e866884dccefe4e9eb2bdba2f1"} Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.733271 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.733287 4795 scope.go:117] "RemoveContainer" containerID="a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.773510 4795 scope.go:117] "RemoveContainer" containerID="99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.794202 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.812855 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.827690 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:12 crc kubenswrapper[4795]: E0310 15:28:12.828990 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="sg-core" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.829033 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="sg-core" Mar 10 15:28:12 crc kubenswrapper[4795]: E0310 15:28:12.829102 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="proxy-httpd" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.829121 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="proxy-httpd" Mar 10 15:28:12 crc kubenswrapper[4795]: E0310 15:28:12.829148 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="ceilometer-notification-agent" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.829164 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="ceilometer-notification-agent" Mar 10 15:28:12 crc kubenswrapper[4795]: E0310 15:28:12.829218 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="ceilometer-central-agent" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.829236 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="ceilometer-central-agent" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.830117 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="proxy-httpd" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.830197 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="sg-core" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.830234 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="ceilometer-notification-agent" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.830271 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" containerName="ceilometer-central-agent" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.839898 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.840050 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.843487 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.844282 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.844727 4795 scope.go:117] "RemoveContainer" containerID="fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.903025 4795 scope.go:117] "RemoveContainer" containerID="bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.939487 4795 scope.go:117] "RemoveContainer" containerID="a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c" Mar 10 15:28:12 crc kubenswrapper[4795]: E0310 15:28:12.940223 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c\": container with ID starting with a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c not found: ID does not exist" containerID="a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.940258 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c"} err="failed to get container status \"a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c\": rpc error: code = NotFound desc = could not find container \"a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c\": container with ID starting with a0bf8b3720eba2ae90ca6e7e4fd9d2fa0a9646572a5b4764c7825b6603f51f6c not found: ID does not exist" Mar 10 15:28:12 crc 
kubenswrapper[4795]: I0310 15:28:12.940279 4795 scope.go:117] "RemoveContainer" containerID="99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256" Mar 10 15:28:12 crc kubenswrapper[4795]: E0310 15:28:12.940533 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256\": container with ID starting with 99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256 not found: ID does not exist" containerID="99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.940555 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256"} err="failed to get container status \"99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256\": rpc error: code = NotFound desc = could not find container \"99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256\": container with ID starting with 99e98fd0e1c659fb5e83d7470df97839b506e5f0e537e14c3a02f218c8163256 not found: ID does not exist" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.940568 4795 scope.go:117] "RemoveContainer" containerID="fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398" Mar 10 15:28:12 crc kubenswrapper[4795]: E0310 15:28:12.940766 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398\": container with ID starting with fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398 not found: ID does not exist" containerID="fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.956546 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398"} err="failed to get container status \"fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398\": rpc error: code = NotFound desc = could not find container \"fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398\": container with ID starting with fe8fbeb3584de5b3ad4a6cff7d094676a0777eaa9790ad85af2327cc08cc2398 not found: ID does not exist" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.956597 4795 scope.go:117] "RemoveContainer" containerID="bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765" Mar 10 15:28:12 crc kubenswrapper[4795]: E0310 15:28:12.958294 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765\": container with ID starting with bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765 not found: ID does not exist" containerID="bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765" Mar 10 15:28:12 crc kubenswrapper[4795]: I0310 15:28:12.958339 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765"} err="failed to get container status \"bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765\": rpc error: code = NotFound desc = could not find container \"bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765\": container with ID starting with bfeb73f6e5e28c2c5c54d8080ed908c4e10b9c750bfd87d59b8a74f600f8b765 not found: ID does not exist" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.010318 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-log-httpd\") pod \"ceilometer-0\" (UID: 
\"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.010381 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-run-httpd\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.010403 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-config-data\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.010472 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.010493 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.010585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-scripts\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.010691 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqgl\" (UniqueName: \"kubernetes.io/projected/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-kube-api-access-xxqgl\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.112717 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-log-httpd\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.112764 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-run-httpd\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.112813 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-config-data\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.112836 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.113335 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-run-httpd\") pod \"ceilometer-0\" (UID: 
\"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.113382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.113495 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-scripts\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.113622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqgl\" (UniqueName: \"kubernetes.io/projected/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-kube-api-access-xxqgl\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.113886 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-log-httpd\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.117656 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-scripts\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.118603 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-config-data\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.119183 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.132494 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.135735 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxqgl\" (UniqueName: \"kubernetes.io/projected/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-kube-api-access-xxqgl\") pod \"ceilometer-0\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") " pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.200279 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.492369 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3d1443a-9a20-413c-9845-6b722f3070b1" path="/var/lib/kubelet/pods/a3d1443a-9a20-413c-9845-6b722f3070b1/volumes" Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.649968 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:13 crc kubenswrapper[4795]: W0310 15:28:13.659947 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b1c8dd0_a8e7_46c2_b9bc_6159f75c5eb4.slice/crio-825dcdc8eecde51f5ce9cdf163735c3fba9cdd5dd9059634e15d62560d777b6d WatchSource:0}: Error finding container 825dcdc8eecde51f5ce9cdf163735c3fba9cdd5dd9059634e15d62560d777b6d: Status 404 returned error can't find the container with id 825dcdc8eecde51f5ce9cdf163735c3fba9cdd5dd9059634e15d62560d777b6d Mar 10 15:28:13 crc kubenswrapper[4795]: I0310 15:28:13.751819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4","Type":"ContainerStarted","Data":"825dcdc8eecde51f5ce9cdf163735c3fba9cdd5dd9059634e15d62560d777b6d"} Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.148573 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.627251 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-rfcwp"] Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.629937 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.633479 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.633916 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.671728 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rfcwp"] Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.770137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4","Type":"ContainerStarted","Data":"e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771"} Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.775525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-scripts\") pod \"nova-cell0-cell-mapping-rfcwp\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.775581 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4pjp\" (UniqueName: \"kubernetes.io/projected/264227be-e2df-4b25-bfff-14226b9f6703-kube-api-access-v4pjp\") pod \"nova-cell0-cell-mapping-rfcwp\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.775612 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-config-data\") pod 
\"nova-cell0-cell-mapping-rfcwp\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.775637 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rfcwp\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.812254 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.814455 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.817615 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.834192 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.835612 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.844414 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.845052 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.865534 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.878278 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-scripts\") pod \"nova-cell0-cell-mapping-rfcwp\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.878324 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4pjp\" (UniqueName: \"kubernetes.io/projected/264227be-e2df-4b25-bfff-14226b9f6703-kube-api-access-v4pjp\") pod \"nova-cell0-cell-mapping-rfcwp\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.878356 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-config-data\") pod \"nova-cell0-cell-mapping-rfcwp\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.878382 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rfcwp\" (UID: 
\"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.891455 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-scripts\") pod \"nova-cell0-cell-mapping-rfcwp\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.891850 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-config-data\") pod \"nova-cell0-cell-mapping-rfcwp\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.915210 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-rfcwp\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.923225 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4pjp\" (UniqueName: \"kubernetes.io/projected/264227be-e2df-4b25-bfff-14226b9f6703-kube-api-access-v4pjp\") pod \"nova-cell0-cell-mapping-rfcwp\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.932262 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.933775 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.946759 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.976088 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.976607 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.979800 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.979844 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgc5h\" (UniqueName: \"kubernetes.io/projected/4a0de041-d7c3-4b89-bd34-5af2dda7539e-kube-api-access-rgc5h\") pod \"nova-scheduler-0\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.979872 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.979892 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-logs\") pod 
\"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.979913 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fhfm\" (UniqueName: \"kubernetes.io/projected/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-kube-api-access-7fhfm\") pod \"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.979941 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-config-data\") pod \"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:14 crc kubenswrapper[4795]: I0310 15:28:14.979994 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-config-data\") pod \"nova-scheduler-0\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.008183 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.009813 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.016669 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.016943 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.073996 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-5nhjm"] Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.075838 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.082348 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54gjg\" (UniqueName: \"kubernetes.io/projected/debc5f2c-40e4-4d20-839a-cdf86203c33c-kube-api-access-54gjg\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.082391 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/debc5f2c-40e4-4d20-839a-cdf86203c33c-logs\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.082418 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.082449 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-config-data\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.082502 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.082526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgc5h\" (UniqueName: \"kubernetes.io/projected/4a0de041-d7c3-4b89-bd34-5af2dda7539e-kube-api-access-rgc5h\") pod \"nova-scheduler-0\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.082551 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.082574 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-logs\") pod \"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.082596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fhfm\" (UniqueName: \"kubernetes.io/projected/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-kube-api-access-7fhfm\") pod 
\"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.082627 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-config-data\") pod \"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.082685 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-config-data\") pod \"nova-scheduler-0\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.089672 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-logs\") pod \"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.092710 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.092744 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.093542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-config-data\") pod \"nova-scheduler-0\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.096880 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-config-data\") pod \"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.099360 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-5nhjm"] Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.117255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgc5h\" (UniqueName: \"kubernetes.io/projected/4a0de041-d7c3-4b89-bd34-5af2dda7539e-kube-api-access-rgc5h\") pod \"nova-scheduler-0\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.121584 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fhfm\" (UniqueName: \"kubernetes.io/projected/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-kube-api-access-7fhfm\") pod \"nova-api-0\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " pod="openstack/nova-api-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.151821 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-config\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186512 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186542 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186589 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186624 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: 
\"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186672 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54gjg\" (UniqueName: \"kubernetes.io/projected/debc5f2c-40e4-4d20-839a-cdf86203c33c-kube-api-access-54gjg\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186688 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/debc5f2c-40e4-4d20-839a-cdf86203c33c-logs\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186708 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186725 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186751 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-config-data\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186774 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9nq2\" (UniqueName: \"kubernetes.io/projected/e98749e7-dda3-43df-af97-4b521fa4e634-kube-api-access-v9nq2\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186799 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-svc\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.186813 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6jqn\" (UniqueName: \"kubernetes.io/projected/3e4c4f75-b00b-414d-89c0-05016ae6fa92-kube-api-access-w6jqn\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.188350 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/debc5f2c-40e4-4d20-839a-cdf86203c33c-logs\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.209636 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.215694 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-config-data\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.238260 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54gjg\" (UniqueName: \"kubernetes.io/projected/debc5f2c-40e4-4d20-839a-cdf86203c33c-kube-api-access-54gjg\") pod \"nova-metadata-0\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.288554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9nq2\" (UniqueName: \"kubernetes.io/projected/e98749e7-dda3-43df-af97-4b521fa4e634-kube-api-access-v9nq2\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.288800 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-svc\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.288824 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6jqn\" (UniqueName: \"kubernetes.io/projected/3e4c4f75-b00b-414d-89c0-05016ae6fa92-kube-api-access-w6jqn\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.288862 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-config\") pod 
\"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.288880 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.288904 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.288955 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.288991 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.289043 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.290116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-config\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.291014 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-svc\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.291573 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.291852 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.292116 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: 
I0310 15:28:15.306762 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.310083 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.313880 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.316962 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.335724 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9nq2\" (UniqueName: \"kubernetes.io/projected/e98749e7-dda3-43df-af97-4b521fa4e634-kube-api-access-v9nq2\") pod \"dnsmasq-dns-bccf8f775-5nhjm\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.359698 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6jqn\" (UniqueName: \"kubernetes.io/projected/3e4c4f75-b00b-414d-89c0-05016ae6fa92-kube-api-access-w6jqn\") pod \"nova-cell1-novncproxy-0\" (UID: \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") " pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.422867 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.453579 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.783642 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4","Type":"ContainerStarted","Data":"6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81"} Mar 10 15:28:15 crc kubenswrapper[4795]: I0310 15:28:15.815326 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-rfcwp"] Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.038105 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkhtd"] Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.042037 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.045129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.045410 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 10 15:28:16 crc kubenswrapper[4795]: W0310 15:28:16.046788 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec8800c2_ae4b_42fa_bcf4_56eaf7c26847.slice/crio-9ec5a27314da3e1d73ed839c84a93df6f36d019e0e18566ec6cb607176803bf2 WatchSource:0}: Error finding container 9ec5a27314da3e1d73ed839c84a93df6f36d019e0e18566ec6cb607176803bf2: Status 404 returned error can't find the container with id 9ec5a27314da3e1d73ed839c84a93df6f36d019e0e18566ec6cb607176803bf2 Mar 10 15:28:16 crc 
kubenswrapper[4795]: I0310 15:28:16.053174 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkhtd"] Mar 10 15:28:16 crc kubenswrapper[4795]: W0310 15:28:16.055698 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a0de041_d7c3_4b89_bd34_5af2dda7539e.slice/crio-64569b9b22e7e3f7ebc67a1a33f5b752ffae5776bbe51318d4a9fca68e821dec WatchSource:0}: Error finding container 64569b9b22e7e3f7ebc67a1a33f5b752ffae5776bbe51318d4a9fca68e821dec: Status 404 returned error can't find the container with id 64569b9b22e7e3f7ebc67a1a33f5b752ffae5776bbe51318d4a9fca68e821dec Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.065438 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.073844 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.222705 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.225844 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-scripts\") pod \"nova-cell1-conductor-db-sync-tkhtd\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.226079 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsgxm\" (UniqueName: \"kubernetes.io/projected/d1a0ed63-687e-4e41-8c99-e299cb991e17-kube-api-access-qsgxm\") pod \"nova-cell1-conductor-db-sync-tkhtd\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 
15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.226134 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tkhtd\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.226185 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-config-data\") pod \"nova-cell1-conductor-db-sync-tkhtd\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: W0310 15:28:16.236599 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddebc5f2c_40e4_4d20_839a_cdf86203c33c.slice/crio-369a058ea930b44659103639cb8415b355feed719d4e9ce23fb9ba54e7cc659d WatchSource:0}: Error finding container 369a058ea930b44659103639cb8415b355feed719d4e9ce23fb9ba54e7cc659d: Status 404 returned error can't find the container with id 369a058ea930b44659103639cb8415b355feed719d4e9ce23fb9ba54e7cc659d Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.242706 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.263220 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-5nhjm"] Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.328054 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsgxm\" (UniqueName: \"kubernetes.io/projected/d1a0ed63-687e-4e41-8c99-e299cb991e17-kube-api-access-qsgxm\") pod \"nova-cell1-conductor-db-sync-tkhtd\" 
(UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.328661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tkhtd\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.328713 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-config-data\") pod \"nova-cell1-conductor-db-sync-tkhtd\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.328756 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-scripts\") pod \"nova-cell1-conductor-db-sync-tkhtd\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.333368 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-tkhtd\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.334631 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-config-data\") pod \"nova-cell1-conductor-db-sync-tkhtd\" (UID: 
\"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.341015 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-scripts\") pod \"nova-cell1-conductor-db-sync-tkhtd\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.346410 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsgxm\" (UniqueName: \"kubernetes.io/projected/d1a0ed63-687e-4e41-8c99-e299cb991e17-kube-api-access-qsgxm\") pod \"nova-cell1-conductor-db-sync-tkhtd\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.462569 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.801588 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rfcwp" event={"ID":"264227be-e2df-4b25-bfff-14226b9f6703","Type":"ContainerStarted","Data":"83cd94d127b49631eeacb75b13436fbf19766ce9c3251ca5bd728af1e189b433"} Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.801854 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rfcwp" event={"ID":"264227be-e2df-4b25-bfff-14226b9f6703","Type":"ContainerStarted","Data":"f3f8fbb3483aca6d0e31f5921898fa15db8baef2a6aaf3e7da3cef7adeeb7330"} Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.803349 4795 generic.go:334] "Generic (PLEG): container finished" podID="e98749e7-dda3-43df-af97-4b521fa4e634" containerID="868aefdb0fb2f3561383295a8d1fd049876743a0bdaec8f07d643efc057493fa" exitCode=0 Mar 10 15:28:16 crc kubenswrapper[4795]: 
I0310 15:28:16.803516 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" event={"ID":"e98749e7-dda3-43df-af97-4b521fa4e634","Type":"ContainerDied","Data":"868aefdb0fb2f3561383295a8d1fd049876743a0bdaec8f07d643efc057493fa"} Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.803561 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" event={"ID":"e98749e7-dda3-43df-af97-4b521fa4e634","Type":"ContainerStarted","Data":"6840e02fd5ab166dd92cd7941c4d250caab3a3b2c42756898df394172c890896"} Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.805219 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e4c4f75-b00b-414d-89c0-05016ae6fa92","Type":"ContainerStarted","Data":"5b255c7ac968a4fe67e2b8d32a7380e773eb4b56858c9cbc1c4c481b644ded00"} Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.807971 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"debc5f2c-40e4-4d20-839a-cdf86203c33c","Type":"ContainerStarted","Data":"369a058ea930b44659103639cb8415b355feed719d4e9ce23fb9ba54e7cc659d"} Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.810935 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4","Type":"ContainerStarted","Data":"8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a"} Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.820139 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a0de041-d7c3-4b89-bd34-5af2dda7539e","Type":"ContainerStarted","Data":"64569b9b22e7e3f7ebc67a1a33f5b752ffae5776bbe51318d4a9fca68e821dec"} Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.822973 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847","Type":"ContainerStarted","Data":"9ec5a27314da3e1d73ed839c84a93df6f36d019e0e18566ec6cb607176803bf2"} Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.823544 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-rfcwp" podStartSLOduration=2.82353362 podStartE2EDuration="2.82353362s" podCreationTimestamp="2026-03-10 15:28:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:16.816962522 +0000 UTC m=+1329.982703420" watchObservedRunningTime="2026-03-10 15:28:16.82353362 +0000 UTC m=+1329.989274518" Mar 10 15:28:16 crc kubenswrapper[4795]: I0310 15:28:16.965585 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkhtd"] Mar 10 15:28:16 crc kubenswrapper[4795]: W0310 15:28:16.975212 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a0ed63_687e_4e41_8c99_e299cb991e17.slice/crio-94690ddc2fc61581b4375c8926338153088d7999496a295a7f978658312e39bf WatchSource:0}: Error finding container 94690ddc2fc61581b4375c8926338153088d7999496a295a7f978658312e39bf: Status 404 returned error can't find the container with id 94690ddc2fc61581b4375c8926338153088d7999496a295a7f978658312e39bf Mar 10 15:28:17 crc kubenswrapper[4795]: I0310 15:28:17.856545 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" event={"ID":"e98749e7-dda3-43df-af97-4b521fa4e634","Type":"ContainerStarted","Data":"23fc78b8ef704320c66fcd5bc9599db62c5fce5b6de059470ca85a5f59b13811"} Mar 10 15:28:17 crc kubenswrapper[4795]: I0310 15:28:17.857156 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:17 crc kubenswrapper[4795]: I0310 15:28:17.860677 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkhtd" event={"ID":"d1a0ed63-687e-4e41-8c99-e299cb991e17","Type":"ContainerStarted","Data":"313c53deb402c3cc0fa68eb9bd3743342a94414c83f34f32ac38395591172e75"} Mar 10 15:28:17 crc kubenswrapper[4795]: I0310 15:28:17.860722 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkhtd" event={"ID":"d1a0ed63-687e-4e41-8c99-e299cb991e17","Type":"ContainerStarted","Data":"94690ddc2fc61581b4375c8926338153088d7999496a295a7f978658312e39bf"} Mar 10 15:28:17 crc kubenswrapper[4795]: I0310 15:28:17.891950 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" podStartSLOduration=2.8919315340000002 podStartE2EDuration="2.891931534s" podCreationTimestamp="2026-03-10 15:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:17.883381049 +0000 UTC m=+1331.049121947" watchObservedRunningTime="2026-03-10 15:28:17.891931534 +0000 UTC m=+1331.057672432" Mar 10 15:28:17 crc kubenswrapper[4795]: I0310 15:28:17.901251 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-tkhtd" podStartSLOduration=1.9012334800000001 podStartE2EDuration="1.90123348s" podCreationTimestamp="2026-03-10 15:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:17.895446784 +0000 UTC m=+1331.061187682" watchObservedRunningTime="2026-03-10 15:28:17.90123348 +0000 UTC m=+1331.066974378" Mar 10 15:28:18 crc kubenswrapper[4795]: I0310 15:28:18.411483 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 10 15:28:18 crc kubenswrapper[4795]: I0310 15:28:18.424898 4795 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.894403 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"debc5f2c-40e4-4d20-839a-cdf86203c33c","Type":"ContainerStarted","Data":"0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a"} Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.894986 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"debc5f2c-40e4-4d20-839a-cdf86203c33c","Type":"ContainerStarted","Data":"335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7"} Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.894469 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="debc5f2c-40e4-4d20-839a-cdf86203c33c" containerName="nova-metadata-metadata" containerID="cri-o://0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a" gracePeriod=30 Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.894419 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="debc5f2c-40e4-4d20-839a-cdf86203c33c" containerName="nova-metadata-log" containerID="cri-o://335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7" gracePeriod=30 Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.903587 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4","Type":"ContainerStarted","Data":"9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0"} Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.905046 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.907769 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"4a0de041-d7c3-4b89-bd34-5af2dda7539e","Type":"ContainerStarted","Data":"4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd"} Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.916361 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847","Type":"ContainerStarted","Data":"fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d"} Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.916409 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847","Type":"ContainerStarted","Data":"07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b"} Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.918770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e4c4f75-b00b-414d-89c0-05016ae6fa92","Type":"ContainerStarted","Data":"1a9c9b167e823eb5c00e5d22c93b25d39a5e09ba225d873cf169298953242506"} Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.918856 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="3e4c4f75-b00b-414d-89c0-05016ae6fa92" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://1a9c9b167e823eb5c00e5d22c93b25d39a5e09ba225d873cf169298953242506" gracePeriod=30 Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.930945 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.594217731 podStartE2EDuration="6.930925757s" podCreationTimestamp="2026-03-10 15:28:14 +0000 UTC" firstStartedPulling="2026-03-10 15:28:16.241600612 +0000 UTC m=+1329.407341510" lastFinishedPulling="2026-03-10 15:28:19.578308638 +0000 UTC m=+1332.744049536" observedRunningTime="2026-03-10 15:28:20.922975149 +0000 UTC m=+1334.088716067" watchObservedRunningTime="2026-03-10 
15:28:20.930925757 +0000 UTC m=+1334.096666655" Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.946930 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.384874698 podStartE2EDuration="6.946912024s" podCreationTimestamp="2026-03-10 15:28:14 +0000 UTC" firstStartedPulling="2026-03-10 15:28:16.053883259 +0000 UTC m=+1329.219624157" lastFinishedPulling="2026-03-10 15:28:19.615920585 +0000 UTC m=+1332.781661483" observedRunningTime="2026-03-10 15:28:20.945960087 +0000 UTC m=+1334.111700985" watchObservedRunningTime="2026-03-10 15:28:20.946912024 +0000 UTC m=+1334.112652922" Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.965652 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.5606337789999998 podStartE2EDuration="6.96563717s" podCreationTimestamp="2026-03-10 15:28:14 +0000 UTC" firstStartedPulling="2026-03-10 15:28:16.210255025 +0000 UTC m=+1329.375995923" lastFinishedPulling="2026-03-10 15:28:19.615258406 +0000 UTC m=+1332.780999314" observedRunningTime="2026-03-10 15:28:20.960186504 +0000 UTC m=+1334.125927402" watchObservedRunningTime="2026-03-10 15:28:20.96563717 +0000 UTC m=+1334.131378068" Mar 10 15:28:20 crc kubenswrapper[4795]: I0310 15:28:20.987035 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.037362229 podStartE2EDuration="8.987013462s" podCreationTimestamp="2026-03-10 15:28:12 +0000 UTC" firstStartedPulling="2026-03-10 15:28:13.664719747 +0000 UTC m=+1326.830460655" lastFinishedPulling="2026-03-10 15:28:19.614371 +0000 UTC m=+1332.780111888" observedRunningTime="2026-03-10 15:28:20.978865999 +0000 UTC m=+1334.144606897" watchObservedRunningTime="2026-03-10 15:28:20.987013462 +0000 UTC m=+1334.152754360" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.000646 4795 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.4714879180000002 podStartE2EDuration="7.000629302s" podCreationTimestamp="2026-03-10 15:28:14 +0000 UTC" firstStartedPulling="2026-03-10 15:28:16.058602554 +0000 UTC m=+1329.224343452" lastFinishedPulling="2026-03-10 15:28:19.587743938 +0000 UTC m=+1332.753484836" observedRunningTime="2026-03-10 15:28:20.999221312 +0000 UTC m=+1334.164962210" watchObservedRunningTime="2026-03-10 15:28:21.000629302 +0000 UTC m=+1334.166370200" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.520191 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.647983 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54gjg\" (UniqueName: \"kubernetes.io/projected/debc5f2c-40e4-4d20-839a-cdf86203c33c-kube-api-access-54gjg\") pod \"debc5f2c-40e4-4d20-839a-cdf86203c33c\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.648148 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-config-data\") pod \"debc5f2c-40e4-4d20-839a-cdf86203c33c\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.648238 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/debc5f2c-40e4-4d20-839a-cdf86203c33c-logs\") pod \"debc5f2c-40e4-4d20-839a-cdf86203c33c\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.648298 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-combined-ca-bundle\") pod \"debc5f2c-40e4-4d20-839a-cdf86203c33c\" (UID: \"debc5f2c-40e4-4d20-839a-cdf86203c33c\") " Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.648584 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/debc5f2c-40e4-4d20-839a-cdf86203c33c-logs" (OuterVolumeSpecName: "logs") pod "debc5f2c-40e4-4d20-839a-cdf86203c33c" (UID: "debc5f2c-40e4-4d20-839a-cdf86203c33c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.649043 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/debc5f2c-40e4-4d20-839a-cdf86203c33c-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.658587 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/debc5f2c-40e4-4d20-839a-cdf86203c33c-kube-api-access-54gjg" (OuterVolumeSpecName: "kube-api-access-54gjg") pod "debc5f2c-40e4-4d20-839a-cdf86203c33c" (UID: "debc5f2c-40e4-4d20-839a-cdf86203c33c"). InnerVolumeSpecName "kube-api-access-54gjg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.683414 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-config-data" (OuterVolumeSpecName: "config-data") pod "debc5f2c-40e4-4d20-839a-cdf86203c33c" (UID: "debc5f2c-40e4-4d20-839a-cdf86203c33c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.697981 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "debc5f2c-40e4-4d20-839a-cdf86203c33c" (UID: "debc5f2c-40e4-4d20-839a-cdf86203c33c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.751049 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.751133 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54gjg\" (UniqueName: \"kubernetes.io/projected/debc5f2c-40e4-4d20-839a-cdf86203c33c-kube-api-access-54gjg\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.751158 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/debc5f2c-40e4-4d20-839a-cdf86203c33c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.931622 4795 generic.go:334] "Generic (PLEG): container finished" podID="debc5f2c-40e4-4d20-839a-cdf86203c33c" containerID="0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a" exitCode=0 Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.932634 4795 generic.go:334] "Generic (PLEG): container finished" podID="debc5f2c-40e4-4d20-839a-cdf86203c33c" containerID="335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7" exitCode=143 Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.933090 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"debc5f2c-40e4-4d20-839a-cdf86203c33c","Type":"ContainerDied","Data":"0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a"} Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.933137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"debc5f2c-40e4-4d20-839a-cdf86203c33c","Type":"ContainerDied","Data":"335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7"} Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.933153 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"debc5f2c-40e4-4d20-839a-cdf86203c33c","Type":"ContainerDied","Data":"369a058ea930b44659103639cb8415b355feed719d4e9ce23fb9ba54e7cc659d"} Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.933152 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.933171 4795 scope.go:117] "RemoveContainer" containerID="0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a" Mar 10 15:28:21 crc kubenswrapper[4795]: I0310 15:28:21.981674 4795 scope.go:117] "RemoveContainer" containerID="335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:21.998807 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.011672 4795 scope.go:117] "RemoveContainer" containerID="0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.012277 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:22 crc kubenswrapper[4795]: E0310 15:28:22.015942 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a\": container with ID starting with 0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a not found: ID does not exist" containerID="0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.016009 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a"} err="failed to get container status \"0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a\": rpc error: code = NotFound desc = could not find container \"0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a\": container with ID starting with 0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a not found: ID does not exist" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.016046 4795 scope.go:117] "RemoveContainer" containerID="335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7" Mar 10 15:28:22 crc kubenswrapper[4795]: E0310 15:28:22.017276 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7\": container with ID starting with 335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7 not found: ID does not exist" containerID="335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.017321 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7"} err="failed to get container status \"335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7\": rpc error: code = NotFound desc = could not find container \"335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7\": container with ID 
starting with 335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7 not found: ID does not exist" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.017348 4795 scope.go:117] "RemoveContainer" containerID="0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.017996 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a"} err="failed to get container status \"0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a\": rpc error: code = NotFound desc = could not find container \"0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a\": container with ID starting with 0ca73eb9ed35bdc3eecbb1ffe8ca0fab716bbc7386a81795132c22a0f82aa43a not found: ID does not exist" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.018029 4795 scope.go:117] "RemoveContainer" containerID="335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.018374 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7"} err="failed to get container status \"335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7\": rpc error: code = NotFound desc = could not find container \"335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7\": container with ID starting with 335c0d8eaf38adf3bd6f4e4c3f202e6e0dc42efdfb7ea02dc9794321c2e0baf7 not found: ID does not exist" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.020642 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:22 crc kubenswrapper[4795]: E0310 15:28:22.021079 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debc5f2c-40e4-4d20-839a-cdf86203c33c" 
containerName="nova-metadata-metadata" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.021097 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="debc5f2c-40e4-4d20-839a-cdf86203c33c" containerName="nova-metadata-metadata" Mar 10 15:28:22 crc kubenswrapper[4795]: E0310 15:28:22.021112 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="debc5f2c-40e4-4d20-839a-cdf86203c33c" containerName="nova-metadata-log" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.021119 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="debc5f2c-40e4-4d20-839a-cdf86203c33c" containerName="nova-metadata-log" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.021282 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="debc5f2c-40e4-4d20-839a-cdf86203c33c" containerName="nova-metadata-log" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.021303 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="debc5f2c-40e4-4d20-839a-cdf86203c33c" containerName="nova-metadata-metadata" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.022261 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.026003 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.026270 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.029516 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.160839 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-config-data\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.161242 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.161349 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.161374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctth4\" (UniqueName: 
\"kubernetes.io/projected/f407307e-9a57-4383-a1df-d87c5adb519b-kube-api-access-ctth4\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.161468 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f407307e-9a57-4383-a1df-d87c5adb519b-logs\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.263954 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f407307e-9a57-4383-a1df-d87c5adb519b-logs\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.264187 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-config-data\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.264279 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.264403 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" 
Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.264447 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctth4\" (UniqueName: \"kubernetes.io/projected/f407307e-9a57-4383-a1df-d87c5adb519b-kube-api-access-ctth4\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.264877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f407307e-9a57-4383-a1df-d87c5adb519b-logs\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.269998 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.271298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.277506 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-config-data\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.285979 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctth4\" (UniqueName: 
\"kubernetes.io/projected/f407307e-9a57-4383-a1df-d87c5adb519b-kube-api-access-ctth4\") pod \"nova-metadata-0\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.341630 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.819147 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:22 crc kubenswrapper[4795]: W0310 15:28:22.831230 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf407307e_9a57_4383_a1df_d87c5adb519b.slice/crio-9badb3fb5863a1cb691c99822240229f3947c78d5d6fbaf6de9a806ba65b7985 WatchSource:0}: Error finding container 9badb3fb5863a1cb691c99822240229f3947c78d5d6fbaf6de9a806ba65b7985: Status 404 returned error can't find the container with id 9badb3fb5863a1cb691c99822240229f3947c78d5d6fbaf6de9a806ba65b7985 Mar 10 15:28:22 crc kubenswrapper[4795]: I0310 15:28:22.952265 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f407307e-9a57-4383-a1df-d87c5adb519b","Type":"ContainerStarted","Data":"9badb3fb5863a1cb691c99822240229f3947c78d5d6fbaf6de9a806ba65b7985"} Mar 10 15:28:23 crc kubenswrapper[4795]: I0310 15:28:23.491370 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="debc5f2c-40e4-4d20-839a-cdf86203c33c" path="/var/lib/kubelet/pods/debc5f2c-40e4-4d20-839a-cdf86203c33c/volumes" Mar 10 15:28:23 crc kubenswrapper[4795]: I0310 15:28:23.966433 4795 generic.go:334] "Generic (PLEG): container finished" podID="264227be-e2df-4b25-bfff-14226b9f6703" containerID="83cd94d127b49631eeacb75b13436fbf19766ce9c3251ca5bd728af1e189b433" exitCode=0 Mar 10 15:28:23 crc kubenswrapper[4795]: I0310 15:28:23.966494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-cell-mapping-rfcwp" event={"ID":"264227be-e2df-4b25-bfff-14226b9f6703","Type":"ContainerDied","Data":"83cd94d127b49631eeacb75b13436fbf19766ce9c3251ca5bd728af1e189b433"} Mar 10 15:28:23 crc kubenswrapper[4795]: I0310 15:28:23.968498 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f407307e-9a57-4383-a1df-d87c5adb519b","Type":"ContainerStarted","Data":"f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197"} Mar 10 15:28:23 crc kubenswrapper[4795]: I0310 15:28:23.968533 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f407307e-9a57-4383-a1df-d87c5adb519b","Type":"ContainerStarted","Data":"eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90"} Mar 10 15:28:24 crc kubenswrapper[4795]: I0310 15:28:24.008890 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.008871925 podStartE2EDuration="3.008871925s" podCreationTimestamp="2026-03-10 15:28:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:24.001788392 +0000 UTC m=+1337.167529290" watchObservedRunningTime="2026-03-10 15:28:24.008871925 +0000 UTC m=+1337.174612823" Mar 10 15:28:24 crc kubenswrapper[4795]: I0310 15:28:24.984127 4795 generic.go:334] "Generic (PLEG): container finished" podID="d1a0ed63-687e-4e41-8c99-e299cb991e17" containerID="313c53deb402c3cc0fa68eb9bd3743342a94414c83f34f32ac38395591172e75" exitCode=0 Mar 10 15:28:24 crc kubenswrapper[4795]: I0310 15:28:24.984184 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkhtd" event={"ID":"d1a0ed63-687e-4e41-8c99-e299cb991e17","Type":"ContainerDied","Data":"313c53deb402c3cc0fa68eb9bd3743342a94414c83f34f32ac38395591172e75"} Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.152953 4795 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.153010 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.307521 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.307879 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.343902 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.424269 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.456337 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.540121 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-wwmtp"] Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.540975 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" podUID="63613082-2a89-4b47-b33e-c1851b7b95fe" containerName="dnsmasq-dns" containerID="cri-o://a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591" gracePeriod=10 Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.565795 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.748106 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-scripts\") pod \"264227be-e2df-4b25-bfff-14226b9f6703\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.748218 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4pjp\" (UniqueName: \"kubernetes.io/projected/264227be-e2df-4b25-bfff-14226b9f6703-kube-api-access-v4pjp\") pod \"264227be-e2df-4b25-bfff-14226b9f6703\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.748277 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-config-data\") pod \"264227be-e2df-4b25-bfff-14226b9f6703\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.748340 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-combined-ca-bundle\") pod \"264227be-e2df-4b25-bfff-14226b9f6703\" (UID: \"264227be-e2df-4b25-bfff-14226b9f6703\") " Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.754965 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/264227be-e2df-4b25-bfff-14226b9f6703-kube-api-access-v4pjp" (OuterVolumeSpecName: "kube-api-access-v4pjp") pod "264227be-e2df-4b25-bfff-14226b9f6703" (UID: "264227be-e2df-4b25-bfff-14226b9f6703"). InnerVolumeSpecName "kube-api-access-v4pjp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.756580 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-scripts" (OuterVolumeSpecName: "scripts") pod "264227be-e2df-4b25-bfff-14226b9f6703" (UID: "264227be-e2df-4b25-bfff-14226b9f6703"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.810213 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "264227be-e2df-4b25-bfff-14226b9f6703" (UID: "264227be-e2df-4b25-bfff-14226b9f6703"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.821738 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-config-data" (OuterVolumeSpecName: "config-data") pod "264227be-e2df-4b25-bfff-14226b9f6703" (UID: "264227be-e2df-4b25-bfff-14226b9f6703"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.851324 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.851387 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.851445 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/264227be-e2df-4b25-bfff-14226b9f6703-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.851479 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4pjp\" (UniqueName: \"kubernetes.io/projected/264227be-e2df-4b25-bfff-14226b9f6703-kube-api-access-v4pjp\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:25 crc kubenswrapper[4795]: I0310 15:28:25.993216 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.000815 4795 generic.go:334] "Generic (PLEG): container finished" podID="63613082-2a89-4b47-b33e-c1851b7b95fe" containerID="a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591" exitCode=0 Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.000877 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" event={"ID":"63613082-2a89-4b47-b33e-c1851b7b95fe","Type":"ContainerDied","Data":"a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591"} Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.000904 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" event={"ID":"63613082-2a89-4b47-b33e-c1851b7b95fe","Type":"ContainerDied","Data":"82c8d3f8e2815fa1e27bca840139d0a7fed3c07831029390919f07defe3d6a7f"} Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.000920 4795 scope.go:117] "RemoveContainer" containerID="a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.007278 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-rfcwp" event={"ID":"264227be-e2df-4b25-bfff-14226b9f6703","Type":"ContainerDied","Data":"f3f8fbb3483aca6d0e31f5921898fa15db8baef2a6aaf3e7da3cef7adeeb7330"} Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.007331 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3f8fbb3483aca6d0e31f5921898fa15db8baef2a6aaf3e7da3cef7adeeb7330" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.007433 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-rfcwp" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.048321 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.060125 4795 scope.go:117] "RemoveContainer" containerID="0b66d62478053796b463254c7b7461109d47c076b280295c44c3fad440e5a275" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.137195 4795 scope.go:117] "RemoveContainer" containerID="a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591" Mar 10 15:28:26 crc kubenswrapper[4795]: E0310 15:28:26.137628 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591\": container with ID starting with a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591 not found: ID does not exist" containerID="a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.137679 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591"} err="failed to get container status \"a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591\": rpc error: code = NotFound desc = could not find container \"a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591\": container with ID starting with a80e3e008adc2fb4f69492279028e47bde00566f0ecf16954fbe9305db71f591 not found: ID does not exist" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.137706 4795 scope.go:117] "RemoveContainer" containerID="0b66d62478053796b463254c7b7461109d47c076b280295c44c3fad440e5a275" Mar 10 15:28:26 crc kubenswrapper[4795]: E0310 15:28:26.140926 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"0b66d62478053796b463254c7b7461109d47c076b280295c44c3fad440e5a275\": container with ID starting with 0b66d62478053796b463254c7b7461109d47c076b280295c44c3fad440e5a275 not found: ID does not exist" containerID="0b66d62478053796b463254c7b7461109d47c076b280295c44c3fad440e5a275" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.140969 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b66d62478053796b463254c7b7461109d47c076b280295c44c3fad440e5a275"} err="failed to get container status \"0b66d62478053796b463254c7b7461109d47c076b280295c44c3fad440e5a275\": rpc error: code = NotFound desc = could not find container \"0b66d62478053796b463254c7b7461109d47c076b280295c44c3fad440e5a275\": container with ID starting with 0b66d62478053796b463254c7b7461109d47c076b280295c44c3fad440e5a275 not found: ID does not exist" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.140927 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.141177 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerName="nova-api-log" containerID="cri-o://07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b" gracePeriod=30 Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.141548 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerName="nova-api-api" containerID="cri-o://fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d" gracePeriod=30 Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.146744 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerName="nova-api-log" probeResult="failure" output="Get 
\"http://10.217.0.194:8774/\": EOF" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.146793 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": EOF" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.161211 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-nb\") pod \"63613082-2a89-4b47-b33e-c1851b7b95fe\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.161315 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-sb\") pod \"63613082-2a89-4b47-b33e-c1851b7b95fe\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.161339 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-svc\") pod \"63613082-2a89-4b47-b33e-c1851b7b95fe\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.161418 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-swift-storage-0\") pod \"63613082-2a89-4b47-b33e-c1851b7b95fe\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.161528 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-config\") pod 
\"63613082-2a89-4b47-b33e-c1851b7b95fe\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.161570 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmldx\" (UniqueName: \"kubernetes.io/projected/63613082-2a89-4b47-b33e-c1851b7b95fe-kube-api-access-kmldx\") pod \"63613082-2a89-4b47-b33e-c1851b7b95fe\" (UID: \"63613082-2a89-4b47-b33e-c1851b7b95fe\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.166625 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.166857 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f407307e-9a57-4383-a1df-d87c5adb519b" containerName="nova-metadata-log" containerID="cri-o://eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90" gracePeriod=30 Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.167100 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63613082-2a89-4b47-b33e-c1851b7b95fe-kube-api-access-kmldx" (OuterVolumeSpecName: "kube-api-access-kmldx") pod "63613082-2a89-4b47-b33e-c1851b7b95fe" (UID: "63613082-2a89-4b47-b33e-c1851b7b95fe"). InnerVolumeSpecName "kube-api-access-kmldx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.167322 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f407307e-9a57-4383-a1df-d87c5adb519b" containerName="nova-metadata-metadata" containerID="cri-o://f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197" gracePeriod=30 Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.260207 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "63613082-2a89-4b47-b33e-c1851b7b95fe" (UID: "63613082-2a89-4b47-b33e-c1851b7b95fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.264408 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.264437 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmldx\" (UniqueName: \"kubernetes.io/projected/63613082-2a89-4b47-b33e-c1851b7b95fe-kube-api-access-kmldx\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.265345 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "63613082-2a89-4b47-b33e-c1851b7b95fe" (UID: "63613082-2a89-4b47-b33e-c1851b7b95fe"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.277215 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-config" (OuterVolumeSpecName: "config") pod "63613082-2a89-4b47-b33e-c1851b7b95fe" (UID: "63613082-2a89-4b47-b33e-c1851b7b95fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.278617 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "63613082-2a89-4b47-b33e-c1851b7b95fe" (UID: "63613082-2a89-4b47-b33e-c1851b7b95fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.279965 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "63613082-2a89-4b47-b33e-c1851b7b95fe" (UID: "63613082-2a89-4b47-b33e-c1851b7b95fe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.366410 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.366439 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.366451 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.366460 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/63613082-2a89-4b47-b33e-c1851b7b95fe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.520599 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.586772 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.673487 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsgxm\" (UniqueName: \"kubernetes.io/projected/d1a0ed63-687e-4e41-8c99-e299cb991e17-kube-api-access-qsgxm\") pod \"d1a0ed63-687e-4e41-8c99-e299cb991e17\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.673593 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-combined-ca-bundle\") pod \"d1a0ed63-687e-4e41-8c99-e299cb991e17\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.673853 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-scripts\") pod \"d1a0ed63-687e-4e41-8c99-e299cb991e17\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.674011 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-config-data\") pod \"d1a0ed63-687e-4e41-8c99-e299cb991e17\" (UID: \"d1a0ed63-687e-4e41-8c99-e299cb991e17\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.679184 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a0ed63-687e-4e41-8c99-e299cb991e17-kube-api-access-qsgxm" (OuterVolumeSpecName: "kube-api-access-qsgxm") pod "d1a0ed63-687e-4e41-8c99-e299cb991e17" (UID: "d1a0ed63-687e-4e41-8c99-e299cb991e17"). 
InnerVolumeSpecName "kube-api-access-qsgxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.690212 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-scripts" (OuterVolumeSpecName: "scripts") pod "d1a0ed63-687e-4e41-8c99-e299cb991e17" (UID: "d1a0ed63-687e-4e41-8c99-e299cb991e17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.701208 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-config-data" (OuterVolumeSpecName: "config-data") pod "d1a0ed63-687e-4e41-8c99-e299cb991e17" (UID: "d1a0ed63-687e-4e41-8c99-e299cb991e17"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.715369 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1a0ed63-687e-4e41-8c99-e299cb991e17" (UID: "d1a0ed63-687e-4e41-8c99-e299cb991e17"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.776574 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.776617 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.776628 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsgxm\" (UniqueName: \"kubernetes.io/projected/d1a0ed63-687e-4e41-8c99-e299cb991e17-kube-api-access-qsgxm\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.776641 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1a0ed63-687e-4e41-8c99-e299cb991e17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.784934 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.877317 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-config-data\") pod \"f407307e-9a57-4383-a1df-d87c5adb519b\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.877868 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-nova-metadata-tls-certs\") pod \"f407307e-9a57-4383-a1df-d87c5adb519b\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.877893 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctth4\" (UniqueName: \"kubernetes.io/projected/f407307e-9a57-4383-a1df-d87c5adb519b-kube-api-access-ctth4\") pod \"f407307e-9a57-4383-a1df-d87c5adb519b\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.877934 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-combined-ca-bundle\") pod \"f407307e-9a57-4383-a1df-d87c5adb519b\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.877965 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f407307e-9a57-4383-a1df-d87c5adb519b-logs\") pod \"f407307e-9a57-4383-a1df-d87c5adb519b\" (UID: \"f407307e-9a57-4383-a1df-d87c5adb519b\") " Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.878448 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/f407307e-9a57-4383-a1df-d87c5adb519b-logs" (OuterVolumeSpecName: "logs") pod "f407307e-9a57-4383-a1df-d87c5adb519b" (UID: "f407307e-9a57-4383-a1df-d87c5adb519b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.883854 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f407307e-9a57-4383-a1df-d87c5adb519b-kube-api-access-ctth4" (OuterVolumeSpecName: "kube-api-access-ctth4") pod "f407307e-9a57-4383-a1df-d87c5adb519b" (UID: "f407307e-9a57-4383-a1df-d87c5adb519b"). InnerVolumeSpecName "kube-api-access-ctth4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.911308 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-config-data" (OuterVolumeSpecName: "config-data") pod "f407307e-9a57-4383-a1df-d87c5adb519b" (UID: "f407307e-9a57-4383-a1df-d87c5adb519b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.917184 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f407307e-9a57-4383-a1df-d87c5adb519b" (UID: "f407307e-9a57-4383-a1df-d87c5adb519b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.932881 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f407307e-9a57-4383-a1df-d87c5adb519b" (UID: "f407307e-9a57-4383-a1df-d87c5adb519b"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.979793 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctth4\" (UniqueName: \"kubernetes.io/projected/f407307e-9a57-4383-a1df-d87c5adb519b-kube-api-access-ctth4\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.979861 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.979875 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.979887 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f407307e-9a57-4383-a1df-d87c5adb519b-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:26 crc kubenswrapper[4795]: I0310 15:28:26.979898 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f407307e-9a57-4383-a1df-d87c5adb519b-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.033802 4795 generic.go:334] "Generic (PLEG): container finished" podID="f407307e-9a57-4383-a1df-d87c5adb519b" containerID="f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197" exitCode=0 Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.033860 4795 generic.go:334] "Generic (PLEG): container finished" podID="f407307e-9a57-4383-a1df-d87c5adb519b" containerID="eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90" exitCode=143 Mar 10 15:28:27 crc kubenswrapper[4795]: 
I0310 15:28:27.033941 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f407307e-9a57-4383-a1df-d87c5adb519b","Type":"ContainerDied","Data":"f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197"} Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.034018 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.034355 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f407307e-9a57-4383-a1df-d87c5adb519b","Type":"ContainerDied","Data":"eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90"} Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.034375 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f407307e-9a57-4383-a1df-d87c5adb519b","Type":"ContainerDied","Data":"9badb3fb5863a1cb691c99822240229f3947c78d5d6fbaf6de9a806ba65b7985"} Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.034390 4795 scope.go:117] "RemoveContainer" containerID="f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.036559 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-tkhtd" event={"ID":"d1a0ed63-687e-4e41-8c99-e299cb991e17","Type":"ContainerDied","Data":"94690ddc2fc61581b4375c8926338153088d7999496a295a7f978658312e39bf"} Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.036614 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94690ddc2fc61581b4375c8926338153088d7999496a295a7f978658312e39bf" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.039991 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-tkhtd" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.047132 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-wwmtp" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.057375 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847","Type":"ContainerDied","Data":"07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b"} Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.057441 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerID="07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b" exitCode=143 Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.065350 4795 scope.go:117] "RemoveContainer" containerID="eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.110009 4795 scope.go:117] "RemoveContainer" containerID="f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.110161 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 15:28:27 crc kubenswrapper[4795]: E0310 15:28:27.110674 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f407307e-9a57-4383-a1df-d87c5adb519b" containerName="nova-metadata-log" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.110690 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f407307e-9a57-4383-a1df-d87c5adb519b" containerName="nova-metadata-log" Mar 10 15:28:27 crc kubenswrapper[4795]: E0310 15:28:27.110731 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a0ed63-687e-4e41-8c99-e299cb991e17" containerName="nova-cell1-conductor-db-sync" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 
15:28:27.110737 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a0ed63-687e-4e41-8c99-e299cb991e17" containerName="nova-cell1-conductor-db-sync" Mar 10 15:28:27 crc kubenswrapper[4795]: E0310 15:28:27.110762 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63613082-2a89-4b47-b33e-c1851b7b95fe" containerName="init" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.110767 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="63613082-2a89-4b47-b33e-c1851b7b95fe" containerName="init" Mar 10 15:28:27 crc kubenswrapper[4795]: E0310 15:28:27.111631 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63613082-2a89-4b47-b33e-c1851b7b95fe" containerName="dnsmasq-dns" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.111644 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="63613082-2a89-4b47-b33e-c1851b7b95fe" containerName="dnsmasq-dns" Mar 10 15:28:27 crc kubenswrapper[4795]: E0310 15:28:27.111663 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f407307e-9a57-4383-a1df-d87c5adb519b" containerName="nova-metadata-metadata" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.111668 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f407307e-9a57-4383-a1df-d87c5adb519b" containerName="nova-metadata-metadata" Mar 10 15:28:27 crc kubenswrapper[4795]: E0310 15:28:27.111694 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="264227be-e2df-4b25-bfff-14226b9f6703" containerName="nova-manage" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.111700 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="264227be-e2df-4b25-bfff-14226b9f6703" containerName="nova-manage" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.112372 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a0ed63-687e-4e41-8c99-e299cb991e17" containerName="nova-cell1-conductor-db-sync" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 
15:28:27.112394 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f407307e-9a57-4383-a1df-d87c5adb519b" containerName="nova-metadata-metadata" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.112409 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="63613082-2a89-4b47-b33e-c1851b7b95fe" containerName="dnsmasq-dns" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.112419 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f407307e-9a57-4383-a1df-d87c5adb519b" containerName="nova-metadata-log" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.112427 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="264227be-e2df-4b25-bfff-14226b9f6703" containerName="nova-manage" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.115904 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:27 crc kubenswrapper[4795]: E0310 15:28:27.118365 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197\": container with ID starting with f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197 not found: ID does not exist" containerID="f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.118412 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197"} err="failed to get container status \"f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197\": rpc error: code = NotFound desc = could not find container \"f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197\": container with ID starting with f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197 not found: ID does not 
exist" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.118440 4795 scope.go:117] "RemoveContainer" containerID="eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90" Mar 10 15:28:27 crc kubenswrapper[4795]: E0310 15:28:27.122015 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90\": container with ID starting with eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90 not found: ID does not exist" containerID="eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.122081 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90"} err="failed to get container status \"eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90\": rpc error: code = NotFound desc = could not find container \"eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90\": container with ID starting with eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90 not found: ID does not exist" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.122105 4795 scope.go:117] "RemoveContainer" containerID="f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.122373 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.123484 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-wwmtp"] Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.124102 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197"} 
err="failed to get container status \"f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197\": rpc error: code = NotFound desc = could not find container \"f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197\": container with ID starting with f23dc0547648f8ae8a41b9230bdee0b40e7ebce6556ff4ed8d73be050fb7f197 not found: ID does not exist" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.124133 4795 scope.go:117] "RemoveContainer" containerID="eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.124634 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90"} err="failed to get container status \"eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90\": rpc error: code = NotFound desc = could not find container \"eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90\": container with ID starting with eaf5b8f6336385590bc6d2836795dfe471a8bbd7680d8f4749b3c42abfba3f90 not found: ID does not exist" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.135263 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-wwmtp"] Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.144522 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.157850 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.177134 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.203717 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.204116 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rmcm\" (UniqueName: \"kubernetes.io/projected/7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d-kube-api-access-9rmcm\") pod \"nova-cell1-conductor-0\" (UID: \"7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.204204 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.208133 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.209912 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.211817 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.217149 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.219129 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.305388 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-config-data\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.305451 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2275d74-795c-45b0-a85d-4883b008e039-logs\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.305515 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.305574 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9k6j\" (UniqueName: \"kubernetes.io/projected/b2275d74-795c-45b0-a85d-4883b008e039-kube-api-access-z9k6j\") pod \"nova-metadata-0\" (UID: 
\"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.305628 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rmcm\" (UniqueName: \"kubernetes.io/projected/7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d-kube-api-access-9rmcm\") pod \"nova-cell1-conductor-0\" (UID: \"7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.305661 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.305705 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.305737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.311105 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 
15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.313581 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.324861 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rmcm\" (UniqueName: \"kubernetes.io/projected/7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d-kube-api-access-9rmcm\") pod \"nova-cell1-conductor-0\" (UID: \"7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d\") " pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.407443 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9k6j\" (UniqueName: \"kubernetes.io/projected/b2275d74-795c-45b0-a85d-4883b008e039-kube-api-access-z9k6j\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.407560 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.407600 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.407706 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-config-data\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.407734 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2275d74-795c-45b0-a85d-4883b008e039-logs\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.408348 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2275d74-795c-45b0-a85d-4883b008e039-logs\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.411528 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.411814 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.412198 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-config-data\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 
crc kubenswrapper[4795]: I0310 15:28:27.429300 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9k6j\" (UniqueName: \"kubernetes.io/projected/b2275d74-795c-45b0-a85d-4883b008e039-kube-api-access-z9k6j\") pod \"nova-metadata-0\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " pod="openstack/nova-metadata-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.452342 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.501124 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63613082-2a89-4b47-b33e-c1851b7b95fe" path="/var/lib/kubelet/pods/63613082-2a89-4b47-b33e-c1851b7b95fe/volumes" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.501891 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f407307e-9a57-4383-a1df-d87c5adb519b" path="/var/lib/kubelet/pods/f407307e-9a57-4383-a1df-d87c5adb519b/volumes" Mar 10 15:28:27 crc kubenswrapper[4795]: I0310 15:28:27.534218 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:28:28 crc kubenswrapper[4795]: I0310 15:28:28.055469 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 10 15:28:28 crc kubenswrapper[4795]: W0310 15:28:28.064529 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f3cf6cb_a222_4fe9_a40d_9e4cf686f58d.slice/crio-6f84d486e605617a643252e7e29f87db6d3632b61557fc2691b333d911be1dd4 WatchSource:0}: Error finding container 6f84d486e605617a643252e7e29f87db6d3632b61557fc2691b333d911be1dd4: Status 404 returned error can't find the container with id 6f84d486e605617a643252e7e29f87db6d3632b61557fc2691b333d911be1dd4 Mar 10 15:28:28 crc kubenswrapper[4795]: I0310 15:28:28.065353 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4a0de041-d7c3-4b89-bd34-5af2dda7539e" containerName="nova-scheduler-scheduler" containerID="cri-o://4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd" gracePeriod=30 Mar 10 15:28:28 crc kubenswrapper[4795]: I0310 15:28:28.130512 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:28:29 crc kubenswrapper[4795]: I0310 15:28:29.080500 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b2275d74-795c-45b0-a85d-4883b008e039","Type":"ContainerStarted","Data":"7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573"} Mar 10 15:28:29 crc kubenswrapper[4795]: I0310 15:28:29.080887 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b2275d74-795c-45b0-a85d-4883b008e039","Type":"ContainerStarted","Data":"5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61"} Mar 10 15:28:29 crc kubenswrapper[4795]: I0310 15:28:29.080907 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"b2275d74-795c-45b0-a85d-4883b008e039","Type":"ContainerStarted","Data":"2301569800a20223c16717e9c179771ef17e86d8812fa86c15993cd81ad14546"} Mar 10 15:28:29 crc kubenswrapper[4795]: I0310 15:28:29.084326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d","Type":"ContainerStarted","Data":"6c3a1006667ed294f2ff53304f7ed134ae0b5e1aefc815b71ea252e0c446eca5"} Mar 10 15:28:29 crc kubenswrapper[4795]: I0310 15:28:29.084360 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d","Type":"ContainerStarted","Data":"6f84d486e605617a643252e7e29f87db6d3632b61557fc2691b333d911be1dd4"} Mar 10 15:28:29 crc kubenswrapper[4795]: I0310 15:28:29.084523 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:29 crc kubenswrapper[4795]: I0310 15:28:29.135158 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.135137298 podStartE2EDuration="2.135137298s" podCreationTimestamp="2026-03-10 15:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:29.126769518 +0000 UTC m=+1342.292510436" watchObservedRunningTime="2026-03-10 15:28:29.135137298 +0000 UTC m=+1342.300878196" Mar 10 15:28:29 crc kubenswrapper[4795]: I0310 15:28:29.137233 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.1372255669999998 podStartE2EDuration="2.137225567s" podCreationTimestamp="2026-03-10 15:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:29.113505778 +0000 UTC 
m=+1342.279246676" watchObservedRunningTime="2026-03-10 15:28:29.137225567 +0000 UTC m=+1342.302966465" Mar 10 15:28:30 crc kubenswrapper[4795]: E0310 15:28:30.309485 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 15:28:30 crc kubenswrapper[4795]: E0310 15:28:30.311597 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 15:28:30 crc kubenswrapper[4795]: E0310 15:28:30.313241 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 15:28:30 crc kubenswrapper[4795]: E0310 15:28:30.313328 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4a0de041-d7c3-4b89-bd34-5af2dda7539e" containerName="nova-scheduler-scheduler" Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.700652 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.790322 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgc5h\" (UniqueName: \"kubernetes.io/projected/4a0de041-d7c3-4b89-bd34-5af2dda7539e-kube-api-access-rgc5h\") pod \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.790503 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-config-data\") pod \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.790568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-combined-ca-bundle\") pod \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\" (UID: \"4a0de041-d7c3-4b89-bd34-5af2dda7539e\") " Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.797999 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a0de041-d7c3-4b89-bd34-5af2dda7539e-kube-api-access-rgc5h" (OuterVolumeSpecName: "kube-api-access-rgc5h") pod "4a0de041-d7c3-4b89-bd34-5af2dda7539e" (UID: "4a0de041-d7c3-4b89-bd34-5af2dda7539e"). InnerVolumeSpecName "kube-api-access-rgc5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.818184 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-config-data" (OuterVolumeSpecName: "config-data") pod "4a0de041-d7c3-4b89-bd34-5af2dda7539e" (UID: "4a0de041-d7c3-4b89-bd34-5af2dda7539e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.825726 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a0de041-d7c3-4b89-bd34-5af2dda7539e" (UID: "4a0de041-d7c3-4b89-bd34-5af2dda7539e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.893398 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgc5h\" (UniqueName: \"kubernetes.io/projected/4a0de041-d7c3-4b89-bd34-5af2dda7539e-kube-api-access-rgc5h\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.893427 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.893437 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a0de041-d7c3-4b89-bd34-5af2dda7539e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.902309 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.994037 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-logs\") pod \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.994101 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fhfm\" (UniqueName: \"kubernetes.io/projected/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-kube-api-access-7fhfm\") pod \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.994246 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-config-data\") pod \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.994271 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-combined-ca-bundle\") pod \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\" (UID: \"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847\") " Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.994458 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-logs" (OuterVolumeSpecName: "logs") pod "ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" (UID: "ec8800c2-ae4b-42fa-bcf4-56eaf7c26847"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.994688 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:31 crc kubenswrapper[4795]: I0310 15:28:31.997087 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-kube-api-access-7fhfm" (OuterVolumeSpecName: "kube-api-access-7fhfm") pod "ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" (UID: "ec8800c2-ae4b-42fa-bcf4-56eaf7c26847"). InnerVolumeSpecName "kube-api-access-7fhfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.025218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-config-data" (OuterVolumeSpecName: "config-data") pod "ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" (UID: "ec8800c2-ae4b-42fa-bcf4-56eaf7c26847"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.049209 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" (UID: "ec8800c2-ae4b-42fa-bcf4-56eaf7c26847"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.096390 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fhfm\" (UniqueName: \"kubernetes.io/projected/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-kube-api-access-7fhfm\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.096442 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.096453 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.110496 4795 generic.go:334] "Generic (PLEG): container finished" podID="4a0de041-d7c3-4b89-bd34-5af2dda7539e" containerID="4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd" exitCode=0 Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.110550 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.110561 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a0de041-d7c3-4b89-bd34-5af2dda7539e","Type":"ContainerDied","Data":"4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd"} Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.110584 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4a0de041-d7c3-4b89-bd34-5af2dda7539e","Type":"ContainerDied","Data":"64569b9b22e7e3f7ebc67a1a33f5b752ffae5776bbe51318d4a9fca68e821dec"} Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.110599 4795 scope.go:117] "RemoveContainer" containerID="4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.112960 4795 generic.go:334] "Generic (PLEG): container finished" podID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerID="fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d" exitCode=0 Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.113007 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847","Type":"ContainerDied","Data":"fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d"} Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.113032 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ec8800c2-ae4b-42fa-bcf4-56eaf7c26847","Type":"ContainerDied","Data":"9ec5a27314da3e1d73ed839c84a93df6f36d019e0e18566ec6cb607176803bf2"} Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.113098 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.143596 4795 scope.go:117] "RemoveContainer" containerID="4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd" Mar 10 15:28:32 crc kubenswrapper[4795]: E0310 15:28:32.147150 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd\": container with ID starting with 4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd not found: ID does not exist" containerID="4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.147207 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd"} err="failed to get container status \"4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd\": rpc error: code = NotFound desc = could not find container \"4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd\": container with ID starting with 4c22d966168a851f8950beb0269515530b52badb18785f1420fa820b555f60fd not found: ID does not exist" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.147239 4795 scope.go:117] "RemoveContainer" containerID="fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.173321 4795 scope.go:117] "RemoveContainer" containerID="07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.186209 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.199832 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 
15:28:32.209891 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.269021 4795 scope.go:117] "RemoveContainer" containerID="fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d" Mar 10 15:28:32 crc kubenswrapper[4795]: E0310 15:28:32.285303 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d\": container with ID starting with fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d not found: ID does not exist" containerID="fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.285419 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d"} err="failed to get container status \"fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d\": rpc error: code = NotFound desc = could not find container \"fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d\": container with ID starting with fa11a6a394a76e083e80ec643a1811a055fd9b49c19023623996052aaaca2b7d not found: ID does not exist" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.285459 4795 scope.go:117] "RemoveContainer" containerID="07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b" Mar 10 15:28:32 crc kubenswrapper[4795]: E0310 15:28:32.296303 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b\": container with ID starting with 07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b not found: ID does not exist" containerID="07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b" Mar 10 15:28:32 crc 
kubenswrapper[4795]: I0310 15:28:32.296401 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b"} err="failed to get container status \"07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b\": rpc error: code = NotFound desc = could not find container \"07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b\": container with ID starting with 07d000524a77965e60cbb1613d0416e6464e64280a926f5b28530587e6c1875b not found: ID does not exist" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.296470 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.313469 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:28:32 crc kubenswrapper[4795]: E0310 15:28:32.315434 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerName="nova-api-log" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.315475 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerName="nova-api-log" Mar 10 15:28:32 crc kubenswrapper[4795]: E0310 15:28:32.315522 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerName="nova-api-api" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.315530 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerName="nova-api-api" Mar 10 15:28:32 crc kubenswrapper[4795]: E0310 15:28:32.315581 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a0de041-d7c3-4b89-bd34-5af2dda7539e" containerName="nova-scheduler-scheduler" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.315588 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4a0de041-d7c3-4b89-bd34-5af2dda7539e" containerName="nova-scheduler-scheduler" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.316028 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a0de041-d7c3-4b89-bd34-5af2dda7539e" containerName="nova-scheduler-scheduler" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.316053 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerName="nova-api-log" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.319129 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" containerName="nova-api-api" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.320094 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.337940 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.342377 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.342726 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.344139 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.347791 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.375119 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.412109 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-config-data\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.412156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.412210 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.412234 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-config-data\") pod \"nova-scheduler-0\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.412300 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-logs\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.412321 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkblz\" (UniqueName: \"kubernetes.io/projected/b7145813-10b4-4328-9926-429b17af8f0e-kube-api-access-bkblz\") pod \"nova-scheduler-0\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.412347 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phdpm\" (UniqueName: \"kubernetes.io/projected/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-kube-api-access-phdpm\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.514182 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-logs\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.514225 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkblz\" (UniqueName: \"kubernetes.io/projected/b7145813-10b4-4328-9926-429b17af8f0e-kube-api-access-bkblz\") pod \"nova-scheduler-0\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.514261 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phdpm\" (UniqueName: 
\"kubernetes.io/projected/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-kube-api-access-phdpm\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.514313 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-config-data\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.514335 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.514401 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.514421 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-config-data\") pod \"nova-scheduler-0\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.517435 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-logs\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.521648 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-config-data\") pod \"nova-scheduler-0\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.527738 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-config-data\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.528646 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.535429 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.535668 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.538845 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.542606 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phdpm\" (UniqueName: \"kubernetes.io/projected/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-kube-api-access-phdpm\") pod \"nova-api-0\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " 
pod="openstack/nova-api-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.546277 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkblz\" (UniqueName: \"kubernetes.io/projected/b7145813-10b4-4328-9926-429b17af8f0e-kube-api-access-bkblz\") pod \"nova-scheduler-0\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.659302 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:28:32 crc kubenswrapper[4795]: I0310 15:28:32.670361 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:28:33 crc kubenswrapper[4795]: I0310 15:28:33.154185 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:28:33 crc kubenswrapper[4795]: W0310 15:28:33.154566 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7145813_10b4_4328_9926_429b17af8f0e.slice/crio-858aa69c44497534fd42bb0cb1241a72ab2f32e8c999b11a5aac2a2cc622a296 WatchSource:0}: Error finding container 858aa69c44497534fd42bb0cb1241a72ab2f32e8c999b11a5aac2a2cc622a296: Status 404 returned error can't find the container with id 858aa69c44497534fd42bb0cb1241a72ab2f32e8c999b11a5aac2a2cc622a296 Mar 10 15:28:33 crc kubenswrapper[4795]: W0310 15:28:33.262461 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode69383ee_8bd7_4df8_aaef_3bbc6d3a4711.slice/crio-f03cd7b7a799124433bb47197ca76433f392dc18fbf12137eb063f935da189b3 WatchSource:0}: Error finding container f03cd7b7a799124433bb47197ca76433f392dc18fbf12137eb063f935da189b3: Status 404 returned error can't find the container with id f03cd7b7a799124433bb47197ca76433f392dc18fbf12137eb063f935da189b3 Mar 10 15:28:33 crc 
kubenswrapper[4795]: I0310 15:28:33.268786 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:28:33 crc kubenswrapper[4795]: I0310 15:28:33.506237 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a0de041-d7c3-4b89-bd34-5af2dda7539e" path="/var/lib/kubelet/pods/4a0de041-d7c3-4b89-bd34-5af2dda7539e/volumes" Mar 10 15:28:33 crc kubenswrapper[4795]: I0310 15:28:33.507823 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec8800c2-ae4b-42fa-bcf4-56eaf7c26847" path="/var/lib/kubelet/pods/ec8800c2-ae4b-42fa-bcf4-56eaf7c26847/volumes" Mar 10 15:28:34 crc kubenswrapper[4795]: I0310 15:28:34.148030 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7145813-10b4-4328-9926-429b17af8f0e","Type":"ContainerStarted","Data":"92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6"} Mar 10 15:28:34 crc kubenswrapper[4795]: I0310 15:28:34.148141 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7145813-10b4-4328-9926-429b17af8f0e","Type":"ContainerStarted","Data":"858aa69c44497534fd42bb0cb1241a72ab2f32e8c999b11a5aac2a2cc622a296"} Mar 10 15:28:34 crc kubenswrapper[4795]: I0310 15:28:34.150581 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711","Type":"ContainerStarted","Data":"826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4"} Mar 10 15:28:34 crc kubenswrapper[4795]: I0310 15:28:34.150668 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711","Type":"ContainerStarted","Data":"400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934"} Mar 10 15:28:34 crc kubenswrapper[4795]: I0310 15:28:34.150690 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711","Type":"ContainerStarted","Data":"f03cd7b7a799124433bb47197ca76433f392dc18fbf12137eb063f935da189b3"} Mar 10 15:28:34 crc kubenswrapper[4795]: I0310 15:28:34.174585 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.174560645 podStartE2EDuration="2.174560645s" podCreationTimestamp="2026-03-10 15:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:34.173273379 +0000 UTC m=+1347.339014307" watchObservedRunningTime="2026-03-10 15:28:34.174560645 +0000 UTC m=+1347.340301563" Mar 10 15:28:34 crc kubenswrapper[4795]: I0310 15:28:34.198703 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.198680566 podStartE2EDuration="2.198680566s" podCreationTimestamp="2026-03-10 15:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:34.197745559 +0000 UTC m=+1347.363486487" watchObservedRunningTime="2026-03-10 15:28:34.198680566 +0000 UTC m=+1347.364421464" Mar 10 15:28:37 crc kubenswrapper[4795]: I0310 15:28:37.496131 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 10 15:28:37 crc kubenswrapper[4795]: I0310 15:28:37.534744 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 15:28:37 crc kubenswrapper[4795]: I0310 15:28:37.534795 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 15:28:37 crc kubenswrapper[4795]: I0310 15:28:37.659719 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 15:28:38 crc kubenswrapper[4795]: I0310 
15:28:38.551298 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:28:38 crc kubenswrapper[4795]: I0310 15:28:38.551319 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:28:41 crc kubenswrapper[4795]: I0310 15:28:41.991979 4795 scope.go:117] "RemoveContainer" containerID="e2375420575b7940817451a4f11faa3d001564ebc848513489f589cf30f357a6" Mar 10 15:28:42 crc kubenswrapper[4795]: I0310 15:28:42.659988 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 15:28:42 crc kubenswrapper[4795]: I0310 15:28:42.671034 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:28:42 crc kubenswrapper[4795]: I0310 15:28:42.671158 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:28:42 crc kubenswrapper[4795]: I0310 15:28:42.694014 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 15:28:43 crc kubenswrapper[4795]: I0310 15:28:43.205990 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 15:28:43 crc kubenswrapper[4795]: I0310 15:28:43.286625 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 15:28:43 crc kubenswrapper[4795]: I0310 15:28:43.752804 4795 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 15:28:43 crc kubenswrapper[4795]: I0310 15:28:43.753338 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.204:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 10 15:28:46 crc kubenswrapper[4795]: I0310 15:28:46.982335 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 15:28:46 crc kubenswrapper[4795]: I0310 15:28:46.987119 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="9095d427-1630-4118-881c-eca71ebf01dc" containerName="kube-state-metrics" containerID="cri-o://ab8f1913a85a8307639a03ea12df17dcb12f506626cc4c85c89f9ee2e7880154" gracePeriod=30
Mar 10 15:28:47 crc kubenswrapper[4795]: I0310 15:28:47.316266 4795 generic.go:334] "Generic (PLEG): container finished" podID="9095d427-1630-4118-881c-eca71ebf01dc" containerID="ab8f1913a85a8307639a03ea12df17dcb12f506626cc4c85c89f9ee2e7880154" exitCode=2
Mar 10 15:28:47 crc kubenswrapper[4795]: I0310 15:28:47.316439 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9095d427-1630-4118-881c-eca71ebf01dc","Type":"ContainerDied","Data":"ab8f1913a85a8307639a03ea12df17dcb12f506626cc4c85c89f9ee2e7880154"}
Mar 10 15:28:47 crc kubenswrapper[4795]: I0310 15:28:47.541630 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 15:28:47 crc kubenswrapper[4795]: I0310 15:28:47.544479 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 10 15:28:47 crc kubenswrapper[4795]: I0310 15:28:47.545472 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 10 15:28:47 crc kubenswrapper[4795]: I0310 15:28:47.568694 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 10 15:28:47 crc kubenswrapper[4795]: I0310 15:28:47.609167 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md4b2\" (UniqueName: \"kubernetes.io/projected/9095d427-1630-4118-881c-eca71ebf01dc-kube-api-access-md4b2\") pod \"9095d427-1630-4118-881c-eca71ebf01dc\" (UID: \"9095d427-1630-4118-881c-eca71ebf01dc\") "
Mar 10 15:28:47 crc kubenswrapper[4795]: I0310 15:28:47.617126 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9095d427-1630-4118-881c-eca71ebf01dc-kube-api-access-md4b2" (OuterVolumeSpecName: "kube-api-access-md4b2") pod "9095d427-1630-4118-881c-eca71ebf01dc" (UID: "9095d427-1630-4118-881c-eca71ebf01dc"). InnerVolumeSpecName "kube-api-access-md4b2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:28:47 crc kubenswrapper[4795]: I0310 15:28:47.711683 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md4b2\" (UniqueName: \"kubernetes.io/projected/9095d427-1630-4118-881c-eca71ebf01dc-kube-api-access-md4b2\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.329872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"9095d427-1630-4118-881c-eca71ebf01dc","Type":"ContainerDied","Data":"28f3dbc3e1226ac07706955a95e36c25b3ca9fc1767580d98b20a2dcd26dc6e1"}
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.329935 4795 scope.go:117] "RemoveContainer" containerID="ab8f1913a85a8307639a03ea12df17dcb12f506626cc4c85c89f9ee2e7880154"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.329980 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.334774 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.404934 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.415580 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.447300 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 15:28:48 crc kubenswrapper[4795]: E0310 15:28:48.448112 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9095d427-1630-4118-881c-eca71ebf01dc" containerName="kube-state-metrics"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.448134 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9095d427-1630-4118-881c-eca71ebf01dc" containerName="kube-state-metrics"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.448375 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9095d427-1630-4118-881c-eca71ebf01dc" containerName="kube-state-metrics"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.449208 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.468783 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.468996 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.474947 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.535287 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f21f73ad-9726-44ae-a239-21adfeb80d1f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.535357 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21f73ad-9726-44ae-a239-21adfeb80d1f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.535411 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cssh\" (UniqueName: \"kubernetes.io/projected/f21f73ad-9726-44ae-a239-21adfeb80d1f-kube-api-access-2cssh\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.535530 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f21f73ad-9726-44ae-a239-21adfeb80d1f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.649197 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f21f73ad-9726-44ae-a239-21adfeb80d1f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.649264 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21f73ad-9726-44ae-a239-21adfeb80d1f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.649313 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cssh\" (UniqueName: \"kubernetes.io/projected/f21f73ad-9726-44ae-a239-21adfeb80d1f-kube-api-access-2cssh\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.649397 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f21f73ad-9726-44ae-a239-21adfeb80d1f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.654711 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f21f73ad-9726-44ae-a239-21adfeb80d1f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.665859 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/f21f73ad-9726-44ae-a239-21adfeb80d1f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.667675 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/f21f73ad-9726-44ae-a239-21adfeb80d1f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.773199 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cssh\" (UniqueName: \"kubernetes.io/projected/f21f73ad-9726-44ae-a239-21adfeb80d1f-kube-api-access-2cssh\") pod \"kube-state-metrics-0\" (UID: \"f21f73ad-9726-44ae-a239-21adfeb80d1f\") " pod="openstack/kube-state-metrics-0"
Mar 10 15:28:48 crc kubenswrapper[4795]: I0310 15:28:48.791982 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 10 15:28:49 crc kubenswrapper[4795]: I0310 15:28:49.265304 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 10 15:28:49 crc kubenswrapper[4795]: I0310 15:28:49.339159 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f21f73ad-9726-44ae-a239-21adfeb80d1f","Type":"ContainerStarted","Data":"295c21118bdbe148ae859f60187f0d9ddd90e8b4ad85fbb21d6b700cd7cf3ed8"}
Mar 10 15:28:49 crc kubenswrapper[4795]: I0310 15:28:49.386466 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 10 15:28:49 crc kubenswrapper[4795]: I0310 15:28:49.386764 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="ceilometer-central-agent" containerID="cri-o://e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771" gracePeriod=30
Mar 10 15:28:49 crc kubenswrapper[4795]: I0310 15:28:49.387366 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="ceilometer-notification-agent" containerID="cri-o://6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81" gracePeriod=30
Mar 10 15:28:49 crc kubenswrapper[4795]: I0310 15:28:49.387376 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="sg-core" containerID="cri-o://8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a" gracePeriod=30
Mar 10 15:28:49 crc kubenswrapper[4795]: I0310 15:28:49.387453 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="proxy-httpd" containerID="cri-o://9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0" gracePeriod=30
Mar 10 15:28:49 crc kubenswrapper[4795]: I0310 15:28:49.488597 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9095d427-1630-4118-881c-eca71ebf01dc" path="/var/lib/kubelet/pods/9095d427-1630-4118-881c-eca71ebf01dc/volumes"
Mar 10 15:28:50 crc kubenswrapper[4795]: I0310 15:28:50.355115 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"f21f73ad-9726-44ae-a239-21adfeb80d1f","Type":"ContainerStarted","Data":"34b8571ce358f6ff6c6173d0a498444aa978ef280866b7ceaef3f40ba0dca08f"}
Mar 10 15:28:50 crc kubenswrapper[4795]: I0310 15:28:50.355403 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 10 15:28:50 crc kubenswrapper[4795]: I0310 15:28:50.358781 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerID="9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0" exitCode=0
Mar 10 15:28:50 crc kubenswrapper[4795]: I0310 15:28:50.358809 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerID="8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a" exitCode=2
Mar 10 15:28:50 crc kubenswrapper[4795]: I0310 15:28:50.358819 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerID="e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771" exitCode=0
Mar 10 15:28:50 crc kubenswrapper[4795]: I0310 15:28:50.358863 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4","Type":"ContainerDied","Data":"9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0"}
Mar 10 15:28:50 crc kubenswrapper[4795]: I0310 15:28:50.358900 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4","Type":"ContainerDied","Data":"8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a"}
Mar 10 15:28:50 crc kubenswrapper[4795]: I0310 15:28:50.358954 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4","Type":"ContainerDied","Data":"e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771"}
Mar 10 15:28:50 crc kubenswrapper[4795]: I0310 15:28:50.374461 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.953196949 podStartE2EDuration="2.374439547s" podCreationTimestamp="2026-03-10 15:28:48 +0000 UTC" firstStartedPulling="2026-03-10 15:28:49.271293859 +0000 UTC m=+1362.437034757" lastFinishedPulling="2026-03-10 15:28:49.692536457 +0000 UTC m=+1362.858277355" observedRunningTime="2026-03-10 15:28:50.37209574 +0000 UTC m=+1363.537836638" watchObservedRunningTime="2026-03-10 15:28:50.374439547 +0000 UTC m=+1363.540180435"
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.371934 4795 generic.go:334] "Generic (PLEG): container finished" podID="3e4c4f75-b00b-414d-89c0-05016ae6fa92" containerID="1a9c9b167e823eb5c00e5d22c93b25d39a5e09ba225d873cf169298953242506" exitCode=137
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.371991 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e4c4f75-b00b-414d-89c0-05016ae6fa92","Type":"ContainerDied","Data":"1a9c9b167e823eb5c00e5d22c93b25d39a5e09ba225d873cf169298953242506"}
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.372579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3e4c4f75-b00b-414d-89c0-05016ae6fa92","Type":"ContainerDied","Data":"5b255c7ac968a4fe67e2b8d32a7380e773eb4b56858c9cbc1c4c481b644ded00"}
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.372599 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b255c7ac968a4fe67e2b8d32a7380e773eb4b56858c9cbc1c4c481b644ded00"
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.382785 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.423899 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-config-data\") pod \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\" (UID: \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") "
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.424054 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6jqn\" (UniqueName: \"kubernetes.io/projected/3e4c4f75-b00b-414d-89c0-05016ae6fa92-kube-api-access-w6jqn\") pod \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\" (UID: \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") "
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.424192 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-combined-ca-bundle\") pod \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\" (UID: \"3e4c4f75-b00b-414d-89c0-05016ae6fa92\") "
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.429797 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e4c4f75-b00b-414d-89c0-05016ae6fa92-kube-api-access-w6jqn" (OuterVolumeSpecName: "kube-api-access-w6jqn") pod "3e4c4f75-b00b-414d-89c0-05016ae6fa92" (UID: "3e4c4f75-b00b-414d-89c0-05016ae6fa92"). InnerVolumeSpecName "kube-api-access-w6jqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.467771 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-config-data" (OuterVolumeSpecName: "config-data") pod "3e4c4f75-b00b-414d-89c0-05016ae6fa92" (UID: "3e4c4f75-b00b-414d-89c0-05016ae6fa92"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.506768 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e4c4f75-b00b-414d-89c0-05016ae6fa92" (UID: "3e4c4f75-b00b-414d-89c0-05016ae6fa92"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.531808 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6jqn\" (UniqueName: \"kubernetes.io/projected/3e4c4f75-b00b-414d-89c0-05016ae6fa92-kube-api-access-w6jqn\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.531838 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:51 crc kubenswrapper[4795]: I0310 15:28:51.531848 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3e4c4f75-b00b-414d-89c0-05016ae6fa92-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.381209 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.506578 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.535392 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.594367 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 15:28:52 crc kubenswrapper[4795]: E0310 15:28:52.595013 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e4c4f75-b00b-414d-89c0-05016ae6fa92" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.595039 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4c4f75-b00b-414d-89c0-05016ae6fa92" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.595385 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e4c4f75-b00b-414d-89c0-05016ae6fa92" containerName="nova-cell1-novncproxy-novncproxy"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.596374 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.599203 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.599497 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.599945 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.640681 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.652797 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.652956 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.653024 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx8ws\" (UniqueName: \"kubernetes.io/projected/62640f9a-d168-4f85-83e0-90caea1b50d4-kube-api-access-wx8ws\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.653341 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.653387 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.677770 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.679001 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.682082 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.683945 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.754516 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.754595 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx8ws\" (UniqueName: \"kubernetes.io/projected/62640f9a-d168-4f85-83e0-90caea1b50d4-kube-api-access-wx8ws\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.754816 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.754852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.754916 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.759686 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.759862 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.760255 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.763447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/62640f9a-d168-4f85-83e0-90caea1b50d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.792471 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx8ws\" (UniqueName: \"kubernetes.io/projected/62640f9a-d168-4f85-83e0-90caea1b50d4-kube-api-access-wx8ws\") pod \"nova-cell1-novncproxy-0\" (UID: \"62640f9a-d168-4f85-83e0-90caea1b50d4\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.866277 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.923134 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.959297 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxqgl\" (UniqueName: \"kubernetes.io/projected/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-kube-api-access-xxqgl\") pod \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") "
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.959386 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-config-data\") pod \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") "
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.959437 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-log-httpd\") pod \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") "
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.960253 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-combined-ca-bundle\") pod \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") "
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.960311 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-sg-core-conf-yaml\") pod \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") "
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.960345 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-run-httpd\") pod \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") "
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.960399 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-scripts\") pod \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\" (UID: \"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4\") "
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.960805 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" (UID: "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.961690 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" (UID: "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.962165 4795 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.962196 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.967411 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-scripts" (OuterVolumeSpecName: "scripts") pod "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" (UID: "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:52 crc kubenswrapper[4795]: I0310 15:28:52.969297 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-kube-api-access-xxqgl" (OuterVolumeSpecName: "kube-api-access-xxqgl") pod "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" (UID: "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4"). InnerVolumeSpecName "kube-api-access-xxqgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.006450 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" (UID: "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.064569 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxqgl\" (UniqueName: \"kubernetes.io/projected/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-kube-api-access-xxqgl\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.064890 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.064904 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-scripts\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.073048 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-config-data" (OuterVolumeSpecName: "config-data") pod "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" (UID: "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.087354 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" (UID: "2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.165967 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-config-data\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.166007 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.408091 4795 generic.go:334] "Generic (PLEG): container finished" podID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerID="6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81" exitCode=0
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.408591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4","Type":"ContainerDied","Data":"6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81"}
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.408622 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.409438 4795 scope.go:117] "RemoveContainer" containerID="9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0"
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.409388 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4","Type":"ContainerDied","Data":"825dcdc8eecde51f5ce9cdf163735c3fba9cdd5dd9059634e15d62560d777b6d"}
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.410279 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.422283 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.462302 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.466540 4795 scope.go:117] "RemoveContainer" containerID="8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a"
Mar 10 15:28:53 crc kubenswrapper[4795]: W0310 15:28:53.485361 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62640f9a_d168_4f85_83e0_90caea1b50d4.slice/crio-17bf2c740836d02edbf2c57a1b5abebdc93ec2fb2be350698e8d4c42f88175d4 WatchSource:0}: Error finding container 17bf2c740836d02edbf2c57a1b5abebdc93ec2fb2be350698e8d4c42f88175d4: Status 404 returned error can't find the container with id 17bf2c740836d02edbf2c57a1b5abebdc93ec2fb2be350698e8d4c42f88175d4
Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.494662 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e4c4f75-b00b-414d-89c0-05016ae6fa92" path="/var/lib/kubelet/pods/3e4c4f75-b00b-414d-89c0-05016ae6fa92/volumes"
Mar 10 15:28:53
crc kubenswrapper[4795]: I0310 15:28:53.533043 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.547526 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.560328 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:53 crc kubenswrapper[4795]: E0310 15:28:53.560821 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="sg-core" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.560847 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="sg-core" Mar 10 15:28:53 crc kubenswrapper[4795]: E0310 15:28:53.560863 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="ceilometer-central-agent" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.560871 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="ceilometer-central-agent" Mar 10 15:28:53 crc kubenswrapper[4795]: E0310 15:28:53.560885 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="ceilometer-notification-agent" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.560895 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="ceilometer-notification-agent" Mar 10 15:28:53 crc kubenswrapper[4795]: E0310 15:28:53.560904 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="proxy-httpd" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.560911 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="proxy-httpd" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.561147 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="ceilometer-central-agent" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.561170 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="ceilometer-notification-agent" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.561179 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="proxy-httpd" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.561199 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" containerName="sg-core" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.570872 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.575166 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.575547 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.575919 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.578180 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.644168 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bdtrv"] Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.647571 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681564 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681611 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-run-httpd\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681688 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-config\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681706 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " 
pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681732 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681759 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-log-httpd\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681787 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-config-data\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681821 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681842 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pv6d\" (UniqueName: \"kubernetes.io/projected/a00db69e-6cfe-4af8-afc3-7f37c36daf18-kube-api-access-7pv6d\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 
15:28:53.681865 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681907 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7v84\" (UniqueName: \"kubernetes.io/projected/202d6bb3-868d-43da-81a0-1321d737fbc8-kube-api-access-c7v84\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681925 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.681938 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-scripts\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.684015 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bdtrv"] Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.690401 4795 scope.go:117] "RemoveContainer" containerID="6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.719142 4795 scope.go:117] "RemoveContainer" 
containerID="e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.773955 4795 scope.go:117] "RemoveContainer" containerID="9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0" Mar 10 15:28:53 crc kubenswrapper[4795]: E0310 15:28:53.776221 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0\": container with ID starting with 9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0 not found: ID does not exist" containerID="9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.776268 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0"} err="failed to get container status \"9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0\": rpc error: code = NotFound desc = could not find container \"9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0\": container with ID starting with 9ba2d5594f29f170ba0598cc2d3e17a8e47dcf5077ce6c6582fe97ef2b980de0 not found: ID does not exist" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.776301 4795 scope.go:117] "RemoveContainer" containerID="8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a" Mar 10 15:28:53 crc kubenswrapper[4795]: E0310 15:28:53.778838 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a\": container with ID starting with 8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a not found: ID does not exist" containerID="8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a" Mar 10 15:28:53 crc 
kubenswrapper[4795]: I0310 15:28:53.778877 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a"} err="failed to get container status \"8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a\": rpc error: code = NotFound desc = could not find container \"8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a\": container with ID starting with 8709b1272fd1049d14868520d9587582b07928eec4b9145648cc7a2c73edd89a not found: ID does not exist" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.778902 4795 scope.go:117] "RemoveContainer" containerID="6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81" Mar 10 15:28:53 crc kubenswrapper[4795]: E0310 15:28:53.779264 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81\": container with ID starting with 6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81 not found: ID does not exist" containerID="6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.779316 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81"} err="failed to get container status \"6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81\": rpc error: code = NotFound desc = could not find container \"6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81\": container with ID starting with 6642b8bfbe14db33435a9826b3d88a98da80861a89786f670a53763ef601cb81 not found: ID does not exist" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.779359 4795 scope.go:117] "RemoveContainer" containerID="e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771" Mar 10 
15:28:53 crc kubenswrapper[4795]: E0310 15:28:53.779625 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771\": container with ID starting with e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771 not found: ID does not exist" containerID="e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.779644 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771"} err="failed to get container status \"e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771\": rpc error: code = NotFound desc = could not find container \"e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771\": container with ID starting with e733083f6d9ef043ca0d9f999b2d43ee190c08b1fde8698c4865e97bce523771 not found: ID does not exist" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785417 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785460 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785537 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-run-httpd\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785565 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-config\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785591 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785625 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785656 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-log-httpd\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785692 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-config-data\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 
15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785763 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pv6d\" (UniqueName: \"kubernetes.io/projected/a00db69e-6cfe-4af8-afc3-7f37c36daf18-kube-api-access-7pv6d\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785796 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785852 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7v84\" (UniqueName: \"kubernetes.io/projected/202d6bb3-868d-43da-81a0-1321d737fbc8-kube-api-access-c7v84\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.785899 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-scripts\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.786029 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-run-httpd\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.787537 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.787910 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.788130 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-log-httpd\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.788519 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: 
\"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.788666 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.788785 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-config\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.792001 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.793008 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-config-data\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.799011 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-scripts\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.799398 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.800530 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.807462 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pv6d\" (UniqueName: \"kubernetes.io/projected/a00db69e-6cfe-4af8-afc3-7f37c36daf18-kube-api-access-7pv6d\") pod \"ceilometer-0\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.813529 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7v84\" (UniqueName: \"kubernetes.io/projected/202d6bb3-868d-43da-81a0-1321d737fbc8-kube-api-access-c7v84\") pod \"dnsmasq-dns-cd5cbd7b9-bdtrv\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.968126 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:28:53 crc kubenswrapper[4795]: I0310 15:28:53.992269 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:54.419819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"62640f9a-d168-4f85-83e0-90caea1b50d4","Type":"ContainerStarted","Data":"e60f9ca6cbd244c87f38e8090d9384f385eb003039daf33e365fe96bc060a7fc"} Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:54.420157 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"62640f9a-d168-4f85-83e0-90caea1b50d4","Type":"ContainerStarted","Data":"17bf2c740836d02edbf2c57a1b5abebdc93ec2fb2be350698e8d4c42f88175d4"} Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:54.446406 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.44638841 podStartE2EDuration="2.44638841s" podCreationTimestamp="2026-03-10 15:28:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:54.435783956 +0000 UTC m=+1367.601524864" watchObservedRunningTime="2026-03-10 15:28:54.44638841 +0000 UTC m=+1367.612129308" Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:54.488712 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:54.569655 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bdtrv"] Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:55.320833 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:55.433572 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a00db69e-6cfe-4af8-afc3-7f37c36daf18","Type":"ContainerStarted","Data":"66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689"} Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:55.433943 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00db69e-6cfe-4af8-afc3-7f37c36daf18","Type":"ContainerStarted","Data":"39eee598b0b2a3af8c090ff3ba4d41ab1a2f77e07579c79952ddc836f54bfeba"} Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:55.436624 4795 generic.go:334] "Generic (PLEG): container finished" podID="202d6bb3-868d-43da-81a0-1321d737fbc8" containerID="712d548dcbdb043e5d9bec46f221b1d16d692ce9e531834474a0d804e287517d" exitCode=0 Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:55.436743 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" event={"ID":"202d6bb3-868d-43da-81a0-1321d737fbc8","Type":"ContainerDied","Data":"712d548dcbdb043e5d9bec46f221b1d16d692ce9e531834474a0d804e287517d"} Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:55.436819 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" event={"ID":"202d6bb3-868d-43da-81a0-1321d737fbc8","Type":"ContainerStarted","Data":"05a73fdbc42d5cbb896ca3ab5ecf18863726d0a0f514dd0714cbb1823c20994a"} Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:55.500324 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4" path="/var/lib/kubelet/pods/2b1c8dd0-a8e7-46c2-b9bc-6159f75c5eb4/volumes" Mar 10 15:28:55 crc kubenswrapper[4795]: I0310 15:28:55.719818 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:28:56 crc kubenswrapper[4795]: I0310 15:28:56.445893 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a00db69e-6cfe-4af8-afc3-7f37c36daf18","Type":"ContainerStarted","Data":"04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28"} Mar 10 15:28:56 crc kubenswrapper[4795]: I0310 15:28:56.447699 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" event={"ID":"202d6bb3-868d-43da-81a0-1321d737fbc8","Type":"ContainerStarted","Data":"1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4"} Mar 10 15:28:56 crc kubenswrapper[4795]: I0310 15:28:56.447856 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" containerName="nova-api-log" containerID="cri-o://400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934" gracePeriod=30 Mar 10 15:28:56 crc kubenswrapper[4795]: I0310 15:28:56.447923 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" containerName="nova-api-api" containerID="cri-o://826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4" gracePeriod=30 Mar 10 15:28:56 crc kubenswrapper[4795]: I0310 15:28:56.493586 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" podStartSLOduration=3.493568211 podStartE2EDuration="3.493568211s" podCreationTimestamp="2026-03-10 15:28:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:28:56.488328241 +0000 UTC m=+1369.654069139" watchObservedRunningTime="2026-03-10 15:28:56.493568211 +0000 UTC m=+1369.659309109" Mar 10 15:28:57 crc kubenswrapper[4795]: I0310 15:28:57.462179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a00db69e-6cfe-4af8-afc3-7f37c36daf18","Type":"ContainerStarted","Data":"c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89"} Mar 10 15:28:57 crc kubenswrapper[4795]: I0310 15:28:57.472057 4795 generic.go:334] "Generic (PLEG): container finished" podID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" containerID="400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934" exitCode=143 Mar 10 15:28:57 crc kubenswrapper[4795]: I0310 15:28:57.472108 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711","Type":"ContainerDied","Data":"400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934"} Mar 10 15:28:57 crc kubenswrapper[4795]: I0310 15:28:57.472769 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:28:57 crc kubenswrapper[4795]: I0310 15:28:57.924105 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:28:58 crc kubenswrapper[4795]: I0310 15:28:58.807284 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 10 15:28:59 crc kubenswrapper[4795]: I0310 15:28:59.493652 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00db69e-6cfe-4af8-afc3-7f37c36daf18","Type":"ContainerStarted","Data":"640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c"} Mar 10 15:28:59 crc kubenswrapper[4795]: I0310 15:28:59.493813 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="ceilometer-central-agent" containerID="cri-o://66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689" gracePeriod=30 Mar 10 15:28:59 crc kubenswrapper[4795]: I0310 15:28:59.494130 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:28:59 crc kubenswrapper[4795]: I0310 15:28:59.494418 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="proxy-httpd" containerID="cri-o://640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c" gracePeriod=30 Mar 10 15:28:59 crc kubenswrapper[4795]: I0310 15:28:59.494467 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="sg-core" containerID="cri-o://c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89" gracePeriod=30 Mar 10 15:28:59 crc kubenswrapper[4795]: I0310 15:28:59.494500 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="ceilometer-notification-agent" containerID="cri-o://04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28" gracePeriod=30 Mar 10 15:28:59 crc kubenswrapper[4795]: I0310 15:28:59.522376 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.686441567 podStartE2EDuration="6.522348882s" podCreationTimestamp="2026-03-10 15:28:53 +0000 UTC" firstStartedPulling="2026-03-10 15:28:54.510220397 +0000 UTC m=+1367.675961295" lastFinishedPulling="2026-03-10 15:28:58.346127702 +0000 UTC m=+1371.511868610" observedRunningTime="2026-03-10 15:28:59.515749183 +0000 UTC m=+1372.681490091" watchObservedRunningTime="2026-03-10 15:28:59.522348882 +0000 UTC m=+1372.688089780" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.199130 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.318755 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-combined-ca-bundle\") pod \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.319176 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-logs\") pod \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.319234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-config-data\") pod \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.319329 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phdpm\" (UniqueName: \"kubernetes.io/projected/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-kube-api-access-phdpm\") pod \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\" (UID: \"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711\") " Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.319624 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-logs" (OuterVolumeSpecName: "logs") pod "e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" (UID: "e69383ee-8bd7-4df8-aaef-3bbc6d3a4711"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.319857 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.324209 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-kube-api-access-phdpm" (OuterVolumeSpecName: "kube-api-access-phdpm") pod "e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" (UID: "e69383ee-8bd7-4df8-aaef-3bbc6d3a4711"). InnerVolumeSpecName "kube-api-access-phdpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.346164 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" (UID: "e69383ee-8bd7-4df8-aaef-3bbc6d3a4711"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.360953 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-config-data" (OuterVolumeSpecName: "config-data") pod "e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" (UID: "e69383ee-8bd7-4df8-aaef-3bbc6d3a4711"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.421500 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.421539 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.421549 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phdpm\" (UniqueName: \"kubernetes.io/projected/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711-kube-api-access-phdpm\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.518345 4795 generic.go:334] "Generic (PLEG): container finished" podID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerID="640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c" exitCode=0 Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.518374 4795 generic.go:334] "Generic (PLEG): container finished" podID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerID="c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89" exitCode=2 Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.518382 4795 generic.go:334] "Generic (PLEG): container finished" podID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerID="04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28" exitCode=0 Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.518418 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00db69e-6cfe-4af8-afc3-7f37c36daf18","Type":"ContainerDied","Data":"640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c"} Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.518443 4795 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00db69e-6cfe-4af8-afc3-7f37c36daf18","Type":"ContainerDied","Data":"c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89"} Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.518454 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00db69e-6cfe-4af8-afc3-7f37c36daf18","Type":"ContainerDied","Data":"04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28"} Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.526611 4795 generic.go:334] "Generic (PLEG): container finished" podID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" containerID="826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4" exitCode=0 Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.526659 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711","Type":"ContainerDied","Data":"826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4"} Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.526669 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.526691 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e69383ee-8bd7-4df8-aaef-3bbc6d3a4711","Type":"ContainerDied","Data":"f03cd7b7a799124433bb47197ca76433f392dc18fbf12137eb063f935da189b3"} Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.526716 4795 scope.go:117] "RemoveContainer" containerID="826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.558773 4795 scope.go:117] "RemoveContainer" containerID="400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.558923 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.571884 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.583398 4795 scope.go:117] "RemoveContainer" containerID="826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4" Mar 10 15:29:00 crc kubenswrapper[4795]: E0310 15:29:00.584887 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4\": container with ID starting with 826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4 not found: ID does not exist" containerID="826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.584982 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4"} err="failed to get container status \"826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4\": rpc error: code = 
NotFound desc = could not find container \"826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4\": container with ID starting with 826fbb6d5f51d268b5eafd7cd65dd45a5aa41418db19946f0f116f9ccd282de4 not found: ID does not exist" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.585055 4795 scope.go:117] "RemoveContainer" containerID="400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934" Mar 10 15:29:00 crc kubenswrapper[4795]: E0310 15:29:00.585332 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934\": container with ID starting with 400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934 not found: ID does not exist" containerID="400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.585420 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934"} err="failed to get container status \"400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934\": rpc error: code = NotFound desc = could not find container \"400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934\": container with ID starting with 400dfd4e2081614c676dbc9aa3404afca484d3db9026efc1d215f3470aa7f934 not found: ID does not exist" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.603042 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:00 crc kubenswrapper[4795]: E0310 15:29:00.603829 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" containerName="nova-api-log" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.603851 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" 
containerName="nova-api-log" Mar 10 15:29:00 crc kubenswrapper[4795]: E0310 15:29:00.603871 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" containerName="nova-api-api" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.603879 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" containerName="nova-api-api" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.604158 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" containerName="nova-api-log" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.604193 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" containerName="nova-api-api" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.606375 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.616216 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.616602 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.616633 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.616962 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.726614 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " 
pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.726700 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-config-data\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.726738 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.726752 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-public-tls-certs\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.726793 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5f2c\" (UniqueName: \"kubernetes.io/projected/83e496db-2b6b-411a-ac51-aca119fa90f0-kube-api-access-r5f2c\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.726825 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e496db-2b6b-411a-ac51-aca119fa90f0-logs\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.829378 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e496db-2b6b-411a-ac51-aca119fa90f0-logs\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.829726 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.830152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-config-data\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.830288 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.830372 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-public-tls-certs\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.830526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5f2c\" (UniqueName: \"kubernetes.io/projected/83e496db-2b6b-411a-ac51-aca119fa90f0-kube-api-access-r5f2c\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " 
pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.831733 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e496db-2b6b-411a-ac51-aca119fa90f0-logs\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.836641 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.838057 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.840055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-public-tls-certs\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.846843 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-config-data\") pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.851729 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5f2c\" (UniqueName: \"kubernetes.io/projected/83e496db-2b6b-411a-ac51-aca119fa90f0-kube-api-access-r5f2c\") 
pod \"nova-api-0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " pod="openstack/nova-api-0" Mar 10 15:29:00 crc kubenswrapper[4795]: I0310 15:29:00.977893 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:29:01 crc kubenswrapper[4795]: I0310 15:29:01.455530 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:01 crc kubenswrapper[4795]: I0310 15:29:01.488659 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e69383ee-8bd7-4df8-aaef-3bbc6d3a4711" path="/var/lib/kubelet/pods/e69383ee-8bd7-4df8-aaef-3bbc6d3a4711/volumes" Mar 10 15:29:01 crc kubenswrapper[4795]: I0310 15:29:01.548632 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83e496db-2b6b-411a-ac51-aca119fa90f0","Type":"ContainerStarted","Data":"6c1c5eaa44b1837d251e41164ff7fb053ec3ee931aa53303e9f528312bbc7c76"} Mar 10 15:29:02 crc kubenswrapper[4795]: I0310 15:29:02.559463 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83e496db-2b6b-411a-ac51-aca119fa90f0","Type":"ContainerStarted","Data":"69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c"} Mar 10 15:29:02 crc kubenswrapper[4795]: I0310 15:29:02.559802 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83e496db-2b6b-411a-ac51-aca119fa90f0","Type":"ContainerStarted","Data":"9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303"} Mar 10 15:29:02 crc kubenswrapper[4795]: I0310 15:29:02.924278 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:02 crc kubenswrapper[4795]: I0310 15:29:02.950665 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:02 crc kubenswrapper[4795]: I0310 15:29:02.982292 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.982275804 podStartE2EDuration="2.982275804s" podCreationTimestamp="2026-03-10 15:29:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:29:02.602666749 +0000 UTC m=+1375.768407677" watchObservedRunningTime="2026-03-10 15:29:02.982275804 +0000 UTC m=+1376.148016702" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.105792 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.174206 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-run-httpd\") pod \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.174614 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-combined-ca-bundle\") pod \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.174732 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-sg-core-conf-yaml\") pod \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.174742 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"a00db69e-6cfe-4af8-afc3-7f37c36daf18" (UID: "a00db69e-6cfe-4af8-afc3-7f37c36daf18"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.174828 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-log-httpd\") pod \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.174854 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-config-data\") pod \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.174904 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pv6d\" (UniqueName: \"kubernetes.io/projected/a00db69e-6cfe-4af8-afc3-7f37c36daf18-kube-api-access-7pv6d\") pod \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.174932 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-ceilometer-tls-certs\") pod \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.174973 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-scripts\") pod \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\" (UID: \"a00db69e-6cfe-4af8-afc3-7f37c36daf18\") " Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.175557 4795 
reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.176511 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a00db69e-6cfe-4af8-afc3-7f37c36daf18" (UID: "a00db69e-6cfe-4af8-afc3-7f37c36daf18"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.180942 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-scripts" (OuterVolumeSpecName: "scripts") pod "a00db69e-6cfe-4af8-afc3-7f37c36daf18" (UID: "a00db69e-6cfe-4af8-afc3-7f37c36daf18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.187480 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a00db69e-6cfe-4af8-afc3-7f37c36daf18-kube-api-access-7pv6d" (OuterVolumeSpecName: "kube-api-access-7pv6d") pod "a00db69e-6cfe-4af8-afc3-7f37c36daf18" (UID: "a00db69e-6cfe-4af8-afc3-7f37c36daf18"). InnerVolumeSpecName "kube-api-access-7pv6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.209351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a00db69e-6cfe-4af8-afc3-7f37c36daf18" (UID: "a00db69e-6cfe-4af8-afc3-7f37c36daf18"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.246241 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a00db69e-6cfe-4af8-afc3-7f37c36daf18" (UID: "a00db69e-6cfe-4af8-afc3-7f37c36daf18"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.264677 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a00db69e-6cfe-4af8-afc3-7f37c36daf18" (UID: "a00db69e-6cfe-4af8-afc3-7f37c36daf18"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.277234 4795 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a00db69e-6cfe-4af8-afc3-7f37c36daf18-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.277269 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pv6d\" (UniqueName: \"kubernetes.io/projected/a00db69e-6cfe-4af8-afc3-7f37c36daf18-kube-api-access-7pv6d\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.277285 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.277297 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-scripts\") on node 
\"crc\" DevicePath \"\"" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.277310 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.277320 4795 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.281535 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-config-data" (OuterVolumeSpecName: "config-data") pod "a00db69e-6cfe-4af8-afc3-7f37c36daf18" (UID: "a00db69e-6cfe-4af8-afc3-7f37c36daf18"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.379123 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a00db69e-6cfe-4af8-afc3-7f37c36daf18-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.575884 4795 generic.go:334] "Generic (PLEG): container finished" podID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerID="66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689" exitCode=0 Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.575960 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.575979 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00db69e-6cfe-4af8-afc3-7f37c36daf18","Type":"ContainerDied","Data":"66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689"} Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.576041 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a00db69e-6cfe-4af8-afc3-7f37c36daf18","Type":"ContainerDied","Data":"39eee598b0b2a3af8c090ff3ba4d41ab1a2f77e07579c79952ddc836f54bfeba"} Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.576119 4795 scope.go:117] "RemoveContainer" containerID="640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.614275 4795 scope.go:117] "RemoveContainer" containerID="c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.614327 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.630561 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.652543 4795 scope.go:117] "RemoveContainer" containerID="04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.656679 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.670325 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:03 crc kubenswrapper[4795]: E0310 15:29:03.670975 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" 
containerName="proxy-httpd" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.671001 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="proxy-httpd" Mar 10 15:29:03 crc kubenswrapper[4795]: E0310 15:29:03.671025 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="ceilometer-notification-agent" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.671037 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="ceilometer-notification-agent" Mar 10 15:29:03 crc kubenswrapper[4795]: E0310 15:29:03.671091 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="sg-core" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.671104 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="sg-core" Mar 10 15:29:03 crc kubenswrapper[4795]: E0310 15:29:03.671134 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="ceilometer-central-agent" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.671146 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="ceilometer-central-agent" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.671443 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="sg-core" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.671473 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="proxy-httpd" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.671501 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="ceilometer-central-agent" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.671511 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" containerName="ceilometer-notification-agent" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.675490 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.681631 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.681958 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.682196 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.682738 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.693767 4795 scope.go:117] "RemoveContainer" containerID="66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.789053 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-config-data\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.789130 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-run-httpd\") pod \"ceilometer-0\" (UID: 
\"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.789156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-scripts\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.789180 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.789207 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.789242 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.789260 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns5qt\" (UniqueName: \"kubernetes.io/projected/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-kube-api-access-ns5qt\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc 
kubenswrapper[4795]: I0310 15:29:03.789296 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-log-httpd\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.835221 4795 scope.go:117] "RemoveContainer" containerID="640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c" Mar 10 15:29:03 crc kubenswrapper[4795]: E0310 15:29:03.839165 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c\": container with ID starting with 640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c not found: ID does not exist" containerID="640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.839201 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c"} err="failed to get container status \"640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c\": rpc error: code = NotFound desc = could not find container \"640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c\": container with ID starting with 640df394168454fc6eef7e8a3d0c028a1e315ddfdcba45c298224462d6d46a5c not found: ID does not exist" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.839222 4795 scope.go:117] "RemoveContainer" containerID="c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89" Mar 10 15:29:03 crc kubenswrapper[4795]: E0310 15:29:03.840278 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89\": container with ID starting with c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89 not found: ID does not exist" containerID="c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.840303 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89"} err="failed to get container status \"c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89\": rpc error: code = NotFound desc = could not find container \"c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89\": container with ID starting with c521e0a788465e1bd7a07409b497197803b27c669e723c4454f954348544ce89 not found: ID does not exist" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.840317 4795 scope.go:117] "RemoveContainer" containerID="04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28" Mar 10 15:29:03 crc kubenswrapper[4795]: E0310 15:29:03.840895 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28\": container with ID starting with 04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28 not found: ID does not exist" containerID="04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.840917 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28"} err="failed to get container status \"04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28\": rpc error: code = NotFound desc = could not find container \"04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28\": container with ID 
starting with 04b04aaf0748623c85b38020c51bcbf432c36d76e226664c2b242bbe729c1d28 not found: ID does not exist" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.840931 4795 scope.go:117] "RemoveContainer" containerID="66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689" Mar 10 15:29:03 crc kubenswrapper[4795]: E0310 15:29:03.841253 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689\": container with ID starting with 66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689 not found: ID does not exist" containerID="66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.841294 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689"} err="failed to get container status \"66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689\": rpc error: code = NotFound desc = could not find container \"66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689\": container with ID starting with 66a6e377ac674dca740437414b72d352945db64345b9dbf1baf17c47cf98c689 not found: ID does not exist" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.886164 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-nnjgg"] Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.887272 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.896596 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.897732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-log-httpd\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.897853 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-config-data\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.897893 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-run-httpd\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.897918 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-scripts\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.897944 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 
15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.897966 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.898004 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.898023 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns5qt\" (UniqueName: \"kubernetes.io/projected/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-kube-api-access-ns5qt\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.898580 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-log-httpd\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.904555 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-run-httpd\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.909539 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.912084 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-nnjgg"] Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.913333 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.914790 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.914892 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.919716 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-scripts\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.926039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-config-data\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.934878 4795 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ns5qt\" (UniqueName: \"kubernetes.io/projected/7066ac26-5bcb-472a-ba10-c8e08af7f0b3-kube-api-access-ns5qt\") pod \"ceilometer-0\" (UID: \"7066ac26-5bcb-472a-ba10-c8e08af7f0b3\") " pod="openstack/ceilometer-0" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.993221 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.999104 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-config-data\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.999148 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.999185 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-scripts\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:03 crc kubenswrapper[4795]: I0310 15:29:03.999229 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x4lw\" (UniqueName: \"kubernetes.io/projected/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-kube-api-access-7x4lw\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: 
\"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.019777 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.099343 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-5nhjm"] Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.100291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-scripts\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.100376 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x4lw\" (UniqueName: \"kubernetes.io/projected/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-kube-api-access-7x4lw\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.100488 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-config-data\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.100513 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:04 crc 
kubenswrapper[4795]: I0310 15:29:04.100956 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" podUID="e98749e7-dda3-43df-af97-4b521fa4e634" containerName="dnsmasq-dns" containerID="cri-o://23fc78b8ef704320c66fcd5bc9599db62c5fce5b6de059470ca85a5f59b13811" gracePeriod=10 Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.104441 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-config-data\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.112275 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-scripts\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.117771 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.126170 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x4lw\" (UniqueName: \"kubernetes.io/projected/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-kube-api-access-7x4lw\") pod \"nova-cell1-cell-mapping-nnjgg\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.313210 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.544619 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.591822 4795 generic.go:334] "Generic (PLEG): container finished" podID="e98749e7-dda3-43df-af97-4b521fa4e634" containerID="23fc78b8ef704320c66fcd5bc9599db62c5fce5b6de059470ca85a5f59b13811" exitCode=0 Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.591944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" event={"ID":"e98749e7-dda3-43df-af97-4b521fa4e634","Type":"ContainerDied","Data":"23fc78b8ef704320c66fcd5bc9599db62c5fce5b6de059470ca85a5f59b13811"} Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.595908 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7066ac26-5bcb-472a-ba10-c8e08af7f0b3","Type":"ContainerStarted","Data":"6b642e8513edfc2d2f9d81cd9e8b22d5a1ee4857fb709b236345cbdc4c6e52dd"} Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.643815 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.724597 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9nq2\" (UniqueName: \"kubernetes.io/projected/e98749e7-dda3-43df-af97-4b521fa4e634-kube-api-access-v9nq2\") pod \"e98749e7-dda3-43df-af97-4b521fa4e634\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.724743 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-nb\") pod \"e98749e7-dda3-43df-af97-4b521fa4e634\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.724785 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-config\") pod \"e98749e7-dda3-43df-af97-4b521fa4e634\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.724857 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-swift-storage-0\") pod \"e98749e7-dda3-43df-af97-4b521fa4e634\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.724939 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-svc\") pod \"e98749e7-dda3-43df-af97-4b521fa4e634\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.724965 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-sb\") pod \"e98749e7-dda3-43df-af97-4b521fa4e634\" (UID: \"e98749e7-dda3-43df-af97-4b521fa4e634\") " Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.744426 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e98749e7-dda3-43df-af97-4b521fa4e634-kube-api-access-v9nq2" (OuterVolumeSpecName: "kube-api-access-v9nq2") pod "e98749e7-dda3-43df-af97-4b521fa4e634" (UID: "e98749e7-dda3-43df-af97-4b521fa4e634"). InnerVolumeSpecName "kube-api-access-v9nq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.775716 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e98749e7-dda3-43df-af97-4b521fa4e634" (UID: "e98749e7-dda3-43df-af97-4b521fa4e634"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.788174 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-config" (OuterVolumeSpecName: "config") pod "e98749e7-dda3-43df-af97-4b521fa4e634" (UID: "e98749e7-dda3-43df-af97-4b521fa4e634"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.788594 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e98749e7-dda3-43df-af97-4b521fa4e634" (UID: "e98749e7-dda3-43df-af97-4b521fa4e634"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.803376 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-nnjgg"] Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.808693 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e98749e7-dda3-43df-af97-4b521fa4e634" (UID: "e98749e7-dda3-43df-af97-4b521fa4e634"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.816455 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e98749e7-dda3-43df-af97-4b521fa4e634" (UID: "e98749e7-dda3-43df-af97-4b521fa4e634"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.826695 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.826724 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.826734 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.826744 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.826752 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e98749e7-dda3-43df-af97-4b521fa4e634-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:04 crc kubenswrapper[4795]: I0310 15:29:04.826760 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9nq2\" (UniqueName: \"kubernetes.io/projected/e98749e7-dda3-43df-af97-4b521fa4e634-kube-api-access-v9nq2\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:05 crc kubenswrapper[4795]: I0310 15:29:05.492391 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a00db69e-6cfe-4af8-afc3-7f37c36daf18" path="/var/lib/kubelet/pods/a00db69e-6cfe-4af8-afc3-7f37c36daf18/volumes" Mar 10 15:29:05 crc kubenswrapper[4795]: I0310 15:29:05.612567 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" event={"ID":"e98749e7-dda3-43df-af97-4b521fa4e634","Type":"ContainerDied","Data":"6840e02fd5ab166dd92cd7941c4d250caab3a3b2c42756898df394172c890896"} Mar 10 15:29:05 crc kubenswrapper[4795]: I0310 15:29:05.612852 4795 scope.go:117] "RemoveContainer" containerID="23fc78b8ef704320c66fcd5bc9599db62c5fce5b6de059470ca85a5f59b13811" Mar 10 15:29:05 crc kubenswrapper[4795]: I0310 15:29:05.612790 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-5nhjm" Mar 10 15:29:05 crc kubenswrapper[4795]: I0310 15:29:05.616121 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7066ac26-5bcb-472a-ba10-c8e08af7f0b3","Type":"ContainerStarted","Data":"93a3b625f176ace944b96af871f23f2a11c367f086823fc22f80e5ff5b31cf52"} Mar 10 15:29:05 crc kubenswrapper[4795]: I0310 15:29:05.621596 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nnjgg" event={"ID":"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e","Type":"ContainerStarted","Data":"9b6994b2c070241525e196af6e56ba0507f837ee4bf086b63a3c23e47fc38fa3"} Mar 10 15:29:05 crc kubenswrapper[4795]: I0310 15:29:05.621642 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nnjgg" event={"ID":"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e","Type":"ContainerStarted","Data":"59beb03616bc094c53d370e6acddc57931a6de533c985397cddfc0f703741fc9"} Mar 10 15:29:05 crc kubenswrapper[4795]: I0310 15:29:05.640333 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-5nhjm"] Mar 10 15:29:05 crc kubenswrapper[4795]: I0310 15:29:05.648483 4795 scope.go:117] "RemoveContainer" containerID="868aefdb0fb2f3561383295a8d1fd049876743a0bdaec8f07d643efc057493fa" Mar 10 15:29:05 crc kubenswrapper[4795]: I0310 15:29:05.654300 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-bccf8f775-5nhjm"] Mar 10 15:29:05 crc kubenswrapper[4795]: I0310 15:29:05.663092 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-nnjgg" podStartSLOduration=2.6630429639999997 podStartE2EDuration="2.663042964s" podCreationTimestamp="2026-03-10 15:29:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:29:05.651285307 +0000 UTC m=+1378.817026225" watchObservedRunningTime="2026-03-10 15:29:05.663042964 +0000 UTC m=+1378.828783862" Mar 10 15:29:06 crc kubenswrapper[4795]: I0310 15:29:06.642713 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7066ac26-5bcb-472a-ba10-c8e08af7f0b3","Type":"ContainerStarted","Data":"6475e7e4f6291a7178867ac6bf3478cf652324d547e2bd845840aebfbf05be27"} Mar 10 15:29:06 crc kubenswrapper[4795]: I0310 15:29:06.643038 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7066ac26-5bcb-472a-ba10-c8e08af7f0b3","Type":"ContainerStarted","Data":"f5cd50dd1d36c2f5dcb92a2238659d10547b06a0e33e48544373c7de65909340"} Mar 10 15:29:07 crc kubenswrapper[4795]: I0310 15:29:07.504771 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e98749e7-dda3-43df-af97-4b521fa4e634" path="/var/lib/kubelet/pods/e98749e7-dda3-43df-af97-4b521fa4e634/volumes" Mar 10 15:29:09 crc kubenswrapper[4795]: I0310 15:29:09.673165 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7066ac26-5bcb-472a-ba10-c8e08af7f0b3","Type":"ContainerStarted","Data":"807550a9673e70d665f8856618736e5edec6d8a4f7180a65eaf4b5c2a57e70a1"} Mar 10 15:29:09 crc kubenswrapper[4795]: I0310 15:29:09.675012 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 10 15:29:09 crc kubenswrapper[4795]: I0310 15:29:09.683278 
4795 generic.go:334] "Generic (PLEG): container finished" podID="01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e" containerID="9b6994b2c070241525e196af6e56ba0507f837ee4bf086b63a3c23e47fc38fa3" exitCode=0 Mar 10 15:29:09 crc kubenswrapper[4795]: I0310 15:29:09.683336 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nnjgg" event={"ID":"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e","Type":"ContainerDied","Data":"9b6994b2c070241525e196af6e56ba0507f837ee4bf086b63a3c23e47fc38fa3"} Mar 10 15:29:09 crc kubenswrapper[4795]: I0310 15:29:09.707058 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.569538267 podStartE2EDuration="6.707041236s" podCreationTimestamp="2026-03-10 15:29:03 +0000 UTC" firstStartedPulling="2026-03-10 15:29:04.55340968 +0000 UTC m=+1377.719150578" lastFinishedPulling="2026-03-10 15:29:08.690912649 +0000 UTC m=+1381.856653547" observedRunningTime="2026-03-10 15:29:09.704636927 +0000 UTC m=+1382.870377825" watchObservedRunningTime="2026-03-10 15:29:09.707041236 +0000 UTC m=+1382.872782134" Mar 10 15:29:10 crc kubenswrapper[4795]: I0310 15:29:10.982616 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:29:10 crc kubenswrapper[4795]: I0310 15:29:10.982931 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.059950 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.139514 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-combined-ca-bundle\") pod \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.139577 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-scripts\") pod \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.139702 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x4lw\" (UniqueName: \"kubernetes.io/projected/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-kube-api-access-7x4lw\") pod \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.139814 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-config-data\") pod \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\" (UID: \"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e\") " Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.163982 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-scripts" (OuterVolumeSpecName: "scripts") pod "01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e" (UID: "01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.164113 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-kube-api-access-7x4lw" (OuterVolumeSpecName: "kube-api-access-7x4lw") pod "01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e" (UID: "01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e"). InnerVolumeSpecName "kube-api-access-7x4lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.167321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e" (UID: "01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.187191 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-config-data" (OuterVolumeSpecName: "config-data") pod "01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e" (UID: "01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.242581 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.242625 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.242676 4795 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-scripts\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.242689 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x4lw\" (UniqueName: \"kubernetes.io/projected/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e-kube-api-access-7x4lw\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.706132 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-nnjgg" event={"ID":"01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e","Type":"ContainerDied","Data":"59beb03616bc094c53d370e6acddc57931a6de533c985397cddfc0f703741fc9"} Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.706181 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59beb03616bc094c53d370e6acddc57931a6de533c985397cddfc0f703741fc9" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.706178 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-nnjgg" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.916185 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.916476 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerName="nova-api-log" containerID="cri-o://9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303" gracePeriod=30 Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.916559 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerName="nova-api-api" containerID="cri-o://69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c" gracePeriod=30 Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.922952 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": EOF" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.923146 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.209:8774/\": EOF" Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.942033 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.942376 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b7145813-10b4-4328-9926-429b17af8f0e" containerName="nova-scheduler-scheduler" containerID="cri-o://92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6" gracePeriod=30 Mar 
10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.965120 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.965342 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-log" containerID="cri-o://5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61" gracePeriod=30 Mar 10 15:29:11 crc kubenswrapper[4795]: I0310 15:29:11.965452 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-metadata" containerID="cri-o://7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573" gracePeriod=30 Mar 10 15:29:12 crc kubenswrapper[4795]: E0310 15:29:12.662249 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 15:29:12 crc kubenswrapper[4795]: E0310 15:29:12.665280 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 15:29:12 crc kubenswrapper[4795]: E0310 15:29:12.666879 4795 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 10 15:29:12 crc kubenswrapper[4795]: E0310 15:29:12.666927 4795 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b7145813-10b4-4328-9926-429b17af8f0e" containerName="nova-scheduler-scheduler" Mar 10 15:29:12 crc kubenswrapper[4795]: I0310 15:29:12.716671 4795 generic.go:334] "Generic (PLEG): container finished" podID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerID="9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303" exitCode=143 Mar 10 15:29:12 crc kubenswrapper[4795]: I0310 15:29:12.716737 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83e496db-2b6b-411a-ac51-aca119fa90f0","Type":"ContainerDied","Data":"9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303"} Mar 10 15:29:12 crc kubenswrapper[4795]: I0310 15:29:12.717996 4795 generic.go:334] "Generic (PLEG): container finished" podID="b2275d74-795c-45b0-a85d-4883b008e039" containerID="5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61" exitCode=143 Mar 10 15:29:12 crc kubenswrapper[4795]: I0310 15:29:12.718020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b2275d74-795c-45b0-a85d-4883b008e039","Type":"ContainerDied","Data":"5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61"} Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.110811 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:37714->10.217.0.202:8775: read: connection reset by peer" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.110813 4795 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": read tcp 10.217.0.2:37712->10.217.0.202:8775: read: connection reset by peer" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.577377 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.730845 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-config-data\") pod \"b2275d74-795c-45b0-a85d-4883b008e039\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.730884 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2275d74-795c-45b0-a85d-4883b008e039-logs\") pod \"b2275d74-795c-45b0-a85d-4883b008e039\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.731012 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9k6j\" (UniqueName: \"kubernetes.io/projected/b2275d74-795c-45b0-a85d-4883b008e039-kube-api-access-z9k6j\") pod \"b2275d74-795c-45b0-a85d-4883b008e039\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.731086 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-combined-ca-bundle\") pod \"b2275d74-795c-45b0-a85d-4883b008e039\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.731182 4795 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-nova-metadata-tls-certs\") pod \"b2275d74-795c-45b0-a85d-4883b008e039\" (UID: \"b2275d74-795c-45b0-a85d-4883b008e039\") " Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.731635 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2275d74-795c-45b0-a85d-4883b008e039-logs" (OuterVolumeSpecName: "logs") pod "b2275d74-795c-45b0-a85d-4883b008e039" (UID: "b2275d74-795c-45b0-a85d-4883b008e039"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.735759 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2275d74-795c-45b0-a85d-4883b008e039-kube-api-access-z9k6j" (OuterVolumeSpecName: "kube-api-access-z9k6j") pod "b2275d74-795c-45b0-a85d-4883b008e039" (UID: "b2275d74-795c-45b0-a85d-4883b008e039"). InnerVolumeSpecName "kube-api-access-z9k6j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.759352 4795 generic.go:334] "Generic (PLEG): container finished" podID="b2275d74-795c-45b0-a85d-4883b008e039" containerID="7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573" exitCode=0 Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.759697 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b2275d74-795c-45b0-a85d-4883b008e039","Type":"ContainerDied","Data":"7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573"} Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.759731 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b2275d74-795c-45b0-a85d-4883b008e039","Type":"ContainerDied","Data":"2301569800a20223c16717e9c179771ef17e86d8812fa86c15993cd81ad14546"} Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.759751 4795 scope.go:117] "RemoveContainer" containerID="7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.759901 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.770019 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2275d74-795c-45b0-a85d-4883b008e039" (UID: "b2275d74-795c-45b0-a85d-4883b008e039"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.783581 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-config-data" (OuterVolumeSpecName: "config-data") pod "b2275d74-795c-45b0-a85d-4883b008e039" (UID: "b2275d74-795c-45b0-a85d-4883b008e039"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.787163 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b2275d74-795c-45b0-a85d-4883b008e039" (UID: "b2275d74-795c-45b0-a85d-4883b008e039"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.834136 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9k6j\" (UniqueName: \"kubernetes.io/projected/b2275d74-795c-45b0-a85d-4883b008e039-kube-api-access-z9k6j\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.834170 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.834179 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.834188 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b2275d74-795c-45b0-a85d-4883b008e039-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.834215 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2275d74-795c-45b0-a85d-4883b008e039-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.850902 4795 scope.go:117] "RemoveContainer" containerID="5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.869706 4795 scope.go:117] "RemoveContainer" containerID="7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573" Mar 10 15:29:15 crc kubenswrapper[4795]: E0310 15:29:15.870225 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573\": container with ID starting with 7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573 not found: ID does not exist" containerID="7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.870255 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573"} err="failed to get container status \"7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573\": rpc error: code = NotFound desc = could not find container \"7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573\": container with ID starting with 7f5f46fa2277d047a77e4a6afe8a240ccfeafbf7a8604b03273b0bb58aef7573 not found: ID does not exist" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.870280 4795 scope.go:117] "RemoveContainer" containerID="5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61" Mar 10 15:29:15 crc kubenswrapper[4795]: E0310 
15:29:15.870677 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61\": container with ID starting with 5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61 not found: ID does not exist" containerID="5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61" Mar 10 15:29:15 crc kubenswrapper[4795]: I0310 15:29:15.870724 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61"} err="failed to get container status \"5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61\": rpc error: code = NotFound desc = could not find container \"5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61\": container with ID starting with 5a04ec81c2482fc686bc7bf71f14bb941fb420045cf35edf8c15c0c0979aed61 not found: ID does not exist" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.094032 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.104551 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.122031 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:16 crc kubenswrapper[4795]: E0310 15:29:16.122575 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e" containerName="nova-manage" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.122592 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e" containerName="nova-manage" Mar 10 15:29:16 crc kubenswrapper[4795]: E0310 15:29:16.122618 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-log" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.122627 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-log" Mar 10 15:29:16 crc kubenswrapper[4795]: E0310 15:29:16.122655 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-metadata" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.122666 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-metadata" Mar 10 15:29:16 crc kubenswrapper[4795]: E0310 15:29:16.122679 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e98749e7-dda3-43df-af97-4b521fa4e634" containerName="init" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.122687 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98749e7-dda3-43df-af97-4b521fa4e634" containerName="init" Mar 10 15:29:16 crc kubenswrapper[4795]: E0310 15:29:16.122696 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e98749e7-dda3-43df-af97-4b521fa4e634" containerName="dnsmasq-dns" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.122705 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98749e7-dda3-43df-af97-4b521fa4e634" containerName="dnsmasq-dns" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.122931 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-log" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.122944 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e98749e7-dda3-43df-af97-4b521fa4e634" containerName="dnsmasq-dns" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.123004 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b2275d74-795c-45b0-a85d-4883b008e039" containerName="nova-metadata-metadata" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.123025 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e" containerName="nova-manage" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.124505 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.128783 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.129662 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.132461 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.242338 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x7sq\" (UniqueName: \"kubernetes.io/projected/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-kube-api-access-7x7sq\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.242383 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.242415 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-logs\") pod 
\"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.242459 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-config-data\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.242587 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.344187 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.344337 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x7sq\" (UniqueName: \"kubernetes.io/projected/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-kube-api-access-7x7sq\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.344371 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 
10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.344422 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-logs\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.345037 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-logs\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.345179 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-config-data\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.348608 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.349163 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-config-data\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.350647 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-combined-ca-bundle\") 
pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.368492 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x7sq\" (UniqueName: \"kubernetes.io/projected/b9d34d9b-f1e9-420e-9b30-d99a9b30f33c-kube-api-access-7x7sq\") pod \"nova-metadata-0\" (UID: \"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c\") " pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.443441 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.751494 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.773030 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.773665 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b7145813-10b4-4328-9926-429b17af8f0e","Type":"ContainerDied","Data":"92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6"} Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.773730 4795 scope.go:117] "RemoveContainer" containerID="92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.774685 4795 generic.go:334] "Generic (PLEG): container finished" podID="b7145813-10b4-4328-9926-429b17af8f0e" containerID="92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6" exitCode=0 Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.774831 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"b7145813-10b4-4328-9926-429b17af8f0e","Type":"ContainerDied","Data":"858aa69c44497534fd42bb0cb1241a72ab2f32e8c999b11a5aac2a2cc622a296"} Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.813396 4795 scope.go:117] "RemoveContainer" containerID="92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6" Mar 10 15:29:16 crc kubenswrapper[4795]: E0310 15:29:16.814060 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6\": container with ID starting with 92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6 not found: ID does not exist" containerID="92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.814215 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6"} err="failed to get container status \"92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6\": rpc error: code = NotFound desc = could not find container \"92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6\": container with ID starting with 92cf4b82023b5774bb33bb1263b5ae6fd63076a35d8ae2467aa1bff62b987aa6 not found: ID does not exist" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.853754 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-combined-ca-bundle\") pod \"b7145813-10b4-4328-9926-429b17af8f0e\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.853802 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkblz\" (UniqueName: 
\"kubernetes.io/projected/b7145813-10b4-4328-9926-429b17af8f0e-kube-api-access-bkblz\") pod \"b7145813-10b4-4328-9926-429b17af8f0e\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.853916 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-config-data\") pod \"b7145813-10b4-4328-9926-429b17af8f0e\" (UID: \"b7145813-10b4-4328-9926-429b17af8f0e\") " Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.858865 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7145813-10b4-4328-9926-429b17af8f0e-kube-api-access-bkblz" (OuterVolumeSpecName: "kube-api-access-bkblz") pod "b7145813-10b4-4328-9926-429b17af8f0e" (UID: "b7145813-10b4-4328-9926-429b17af8f0e"). InnerVolumeSpecName "kube-api-access-bkblz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.882206 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-config-data" (OuterVolumeSpecName: "config-data") pod "b7145813-10b4-4328-9926-429b17af8f0e" (UID: "b7145813-10b4-4328-9926-429b17af8f0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.888423 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7145813-10b4-4328-9926-429b17af8f0e" (UID: "b7145813-10b4-4328-9926-429b17af8f0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:16 crc kubenswrapper[4795]: W0310 15:29:16.949839 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9d34d9b_f1e9_420e_9b30_d99a9b30f33c.slice/crio-e2da2fbe4cfd5b219788cb56b101d8c38811fda21145e6320cbbb61f187d9ae9 WatchSource:0}: Error finding container e2da2fbe4cfd5b219788cb56b101d8c38811fda21145e6320cbbb61f187d9ae9: Status 404 returned error can't find the container with id e2da2fbe4cfd5b219788cb56b101d8c38811fda21145e6320cbbb61f187d9ae9 Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.955416 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.955441 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkblz\" (UniqueName: \"kubernetes.io/projected/b7145813-10b4-4328-9926-429b17af8f0e-kube-api-access-bkblz\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.955456 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7145813-10b4-4328-9926-429b17af8f0e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:16 crc kubenswrapper[4795]: I0310 15:29:16.955811 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.125408 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.147707 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.174129 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-scheduler-0"] Mar 10 15:29:17 crc kubenswrapper[4795]: E0310 15:29:17.174715 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7145813-10b4-4328-9926-429b17af8f0e" containerName="nova-scheduler-scheduler" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.174775 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7145813-10b4-4328-9926-429b17af8f0e" containerName="nova-scheduler-scheduler" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.174989 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7145813-10b4-4328-9926-429b17af8f0e" containerName="nova-scheduler-scheduler" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.175635 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.178484 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.192833 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.261439 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4wf7\" (UniqueName: \"kubernetes.io/projected/020512d0-7890-48cf-8ce2-f9d08feef2e6-kube-api-access-k4wf7\") pod \"nova-scheduler-0\" (UID: \"020512d0-7890-48cf-8ce2-f9d08feef2e6\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.261730 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020512d0-7890-48cf-8ce2-f9d08feef2e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"020512d0-7890-48cf-8ce2-f9d08feef2e6\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 
15:29:17.261830 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020512d0-7890-48cf-8ce2-f9d08feef2e6-config-data\") pod \"nova-scheduler-0\" (UID: \"020512d0-7890-48cf-8ce2-f9d08feef2e6\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.363526 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020512d0-7890-48cf-8ce2-f9d08feef2e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"020512d0-7890-48cf-8ce2-f9d08feef2e6\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.363603 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020512d0-7890-48cf-8ce2-f9d08feef2e6-config-data\") pod \"nova-scheduler-0\" (UID: \"020512d0-7890-48cf-8ce2-f9d08feef2e6\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.363639 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4wf7\" (UniqueName: \"kubernetes.io/projected/020512d0-7890-48cf-8ce2-f9d08feef2e6-kube-api-access-k4wf7\") pod \"nova-scheduler-0\" (UID: \"020512d0-7890-48cf-8ce2-f9d08feef2e6\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.366813 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020512d0-7890-48cf-8ce2-f9d08feef2e6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"020512d0-7890-48cf-8ce2-f9d08feef2e6\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.368745 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/020512d0-7890-48cf-8ce2-f9d08feef2e6-config-data\") pod \"nova-scheduler-0\" (UID: \"020512d0-7890-48cf-8ce2-f9d08feef2e6\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.385827 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4wf7\" (UniqueName: \"kubernetes.io/projected/020512d0-7890-48cf-8ce2-f9d08feef2e6-kube-api-access-k4wf7\") pod \"nova-scheduler-0\" (UID: \"020512d0-7890-48cf-8ce2-f9d08feef2e6\") " pod="openstack/nova-scheduler-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.498829 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2275d74-795c-45b0-a85d-4883b008e039" path="/var/lib/kubelet/pods/b2275d74-795c-45b0-a85d-4883b008e039/volumes" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.502425 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7145813-10b4-4328-9926-429b17af8f0e" path="/var/lib/kubelet/pods/b7145813-10b4-4328-9926-429b17af8f0e/volumes" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.561534 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.799241 4795 generic.go:334] "Generic (PLEG): container finished" podID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerID="69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c" exitCode=0 Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.799396 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.799433 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83e496db-2b6b-411a-ac51-aca119fa90f0","Type":"ContainerDied","Data":"69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c"} Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.799665 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83e496db-2b6b-411a-ac51-aca119fa90f0","Type":"ContainerDied","Data":"6c1c5eaa44b1837d251e41164ff7fb053ec3ee931aa53303e9f528312bbc7c76"} Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.799689 4795 scope.go:117] "RemoveContainer" containerID="69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.805381 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c","Type":"ContainerStarted","Data":"4ff0652bca02d9dd2631d9f6212fc397152f990911daead80063fcd16e17fdaf"} Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.805422 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c","Type":"ContainerStarted","Data":"29a4168124ce88351e40b7ca9a9f902445b7d1ebc2f2147aed0f43e23a85d1b9"} Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.805437 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9d34d9b-f1e9-420e-9b30-d99a9b30f33c","Type":"ContainerStarted","Data":"e2da2fbe4cfd5b219788cb56b101d8c38811fda21145e6320cbbb61f187d9ae9"} Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.845262 4795 scope.go:117] "RemoveContainer" containerID="9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.847239 4795 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.847228854 podStartE2EDuration="1.847228854s" podCreationTimestamp="2026-03-10 15:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:29:17.845619538 +0000 UTC m=+1391.011360456" watchObservedRunningTime="2026-03-10 15:29:17.847228854 +0000 UTC m=+1391.012969752" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.871303 4795 scope.go:117] "RemoveContainer" containerID="69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c" Mar 10 15:29:17 crc kubenswrapper[4795]: E0310 15:29:17.871772 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c\": container with ID starting with 69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c not found: ID does not exist" containerID="69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.871809 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c"} err="failed to get container status \"69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c\": rpc error: code = NotFound desc = could not find container \"69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c\": container with ID starting with 69e601aa7986aaee4b90176a2eb3a40ef3ce704a10ee5171443c36f947fc6f1c not found: ID does not exist" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.871833 4795 scope.go:117] "RemoveContainer" containerID="9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303" Mar 10 15:29:17 crc kubenswrapper[4795]: E0310 15:29:17.872873 4795 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303\": container with ID starting with 9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303 not found: ID does not exist" containerID="9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.872903 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303"} err="failed to get container status \"9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303\": rpc error: code = NotFound desc = could not find container \"9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303\": container with ID starting with 9448d927e735fbdcd078f19dcef1233f3ec1564469dbd7333b0772a7d4a6c303 not found: ID does not exist" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.877098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5f2c\" (UniqueName: \"kubernetes.io/projected/83e496db-2b6b-411a-ac51-aca119fa90f0-kube-api-access-r5f2c\") pod \"83e496db-2b6b-411a-ac51-aca119fa90f0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.877137 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-public-tls-certs\") pod \"83e496db-2b6b-411a-ac51-aca119fa90f0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.877227 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-internal-tls-certs\") pod 
\"83e496db-2b6b-411a-ac51-aca119fa90f0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.877252 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-config-data\") pod \"83e496db-2b6b-411a-ac51-aca119fa90f0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.877288 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-combined-ca-bundle\") pod \"83e496db-2b6b-411a-ac51-aca119fa90f0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.877353 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e496db-2b6b-411a-ac51-aca119fa90f0-logs\") pod \"83e496db-2b6b-411a-ac51-aca119fa90f0\" (UID: \"83e496db-2b6b-411a-ac51-aca119fa90f0\") " Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.878130 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83e496db-2b6b-411a-ac51-aca119fa90f0-logs" (OuterVolumeSpecName: "logs") pod "83e496db-2b6b-411a-ac51-aca119fa90f0" (UID: "83e496db-2b6b-411a-ac51-aca119fa90f0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.885754 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83e496db-2b6b-411a-ac51-aca119fa90f0-kube-api-access-r5f2c" (OuterVolumeSpecName: "kube-api-access-r5f2c") pod "83e496db-2b6b-411a-ac51-aca119fa90f0" (UID: "83e496db-2b6b-411a-ac51-aca119fa90f0"). InnerVolumeSpecName "kube-api-access-r5f2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.907459 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-config-data" (OuterVolumeSpecName: "config-data") pod "83e496db-2b6b-411a-ac51-aca119fa90f0" (UID: "83e496db-2b6b-411a-ac51-aca119fa90f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.912637 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83e496db-2b6b-411a-ac51-aca119fa90f0" (UID: "83e496db-2b6b-411a-ac51-aca119fa90f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.942357 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "83e496db-2b6b-411a-ac51-aca119fa90f0" (UID: "83e496db-2b6b-411a-ac51-aca119fa90f0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.944789 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "83e496db-2b6b-411a-ac51-aca119fa90f0" (UID: "83e496db-2b6b-411a-ac51-aca119fa90f0"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.980974 4795 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83e496db-2b6b-411a-ac51-aca119fa90f0-logs\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.981310 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5f2c\" (UniqueName: \"kubernetes.io/projected/83e496db-2b6b-411a-ac51-aca119fa90f0-kube-api-access-r5f2c\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.981396 4795 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.981432 4795 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.981452 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:17 crc kubenswrapper[4795]: I0310 15:29:17.981470 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83e496db-2b6b-411a-ac51-aca119fa90f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:18 crc kubenswrapper[4795]: W0310 15:29:18.076497 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod020512d0_7890_48cf_8ce2_f9d08feef2e6.slice/crio-bde66134fef9d8efe518e17c842ccdc58357f7bf2b82ad64c4d2c5ac8b1ce92c WatchSource:0}: Error finding 
container bde66134fef9d8efe518e17c842ccdc58357f7bf2b82ad64c4d2c5ac8b1ce92c: Status 404 returned error can't find the container with id bde66134fef9d8efe518e17c842ccdc58357f7bf2b82ad64c4d2c5ac8b1ce92c Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.083297 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.539821 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.540309 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.823215 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.825479 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"020512d0-7890-48cf-8ce2-f9d08feef2e6","Type":"ContainerStarted","Data":"a316a2f79b9d3063f2bf7f2b4dba222b285b94ff94b2b3caf1eb03c73ea1b2d2"} Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.825537 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"020512d0-7890-48cf-8ce2-f9d08feef2e6","Type":"ContainerStarted","Data":"bde66134fef9d8efe518e17c842ccdc58357f7bf2b82ad64c4d2c5ac8b1ce92c"} Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.848089 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.848063834 podStartE2EDuration="1.848063834s" podCreationTimestamp="2026-03-10 15:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:29:18.843511573 +0000 UTC m=+1392.009252471" watchObservedRunningTime="2026-03-10 15:29:18.848063834 +0000 UTC m=+1392.013804742" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.869801 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.877547 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.898568 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:18 crc kubenswrapper[4795]: E0310 15:29:18.899041 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerName="nova-api-api" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.899057 4795 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerName="nova-api-api" Mar 10 15:29:18 crc kubenswrapper[4795]: E0310 15:29:18.899083 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerName="nova-api-log" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.899091 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerName="nova-api-log" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.899418 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerName="nova-api-api" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.899466 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="83e496db-2b6b-411a-ac51-aca119fa90f0" containerName="nova-api-log" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.900724 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.902950 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.902964 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.904528 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 10 15:29:18 crc kubenswrapper[4795]: I0310 15:29:18.908597 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.001266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-config-data\") pod \"nova-api-0\" (UID: 
\"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.001374 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.001467 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-logs\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.001494 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.001516 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp84f\" (UniqueName: \"kubernetes.io/projected/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-kube-api-access-hp84f\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.001585 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-public-tls-certs\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.105448 
4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-public-tls-certs\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.105637 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-config-data\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.105707 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.105792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-logs\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.105823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.105870 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp84f\" (UniqueName: \"kubernetes.io/projected/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-kube-api-access-hp84f\") pod \"nova-api-0\" (UID: 
\"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.106365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-logs\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.112038 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.113918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.114514 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-public-tls-certs\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.116312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-config-data\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.125036 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp84f\" (UniqueName: 
\"kubernetes.io/projected/6edfb391-f92d-4a3c-9e60-e32038dc9f5e-kube-api-access-hp84f\") pod \"nova-api-0\" (UID: \"6edfb391-f92d-4a3c-9e60-e32038dc9f5e\") " pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.219810 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.494979 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83e496db-2b6b-411a-ac51-aca119fa90f0" path="/var/lib/kubelet/pods/83e496db-2b6b-411a-ac51-aca119fa90f0/volumes" Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.701995 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 10 15:29:19 crc kubenswrapper[4795]: I0310 15:29:19.844593 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6edfb391-f92d-4a3c-9e60-e32038dc9f5e","Type":"ContainerStarted","Data":"374dfb3334705e8017afb9ca3ad262066f2e9f845fabf611404ac7d1ef7ccfd7"} Mar 10 15:29:20 crc kubenswrapper[4795]: I0310 15:29:20.853603 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6edfb391-f92d-4a3c-9e60-e32038dc9f5e","Type":"ContainerStarted","Data":"cd1385351771073fb306eedf658174a445bef6d43fdc34096e6a0a539e0489e4"} Mar 10 15:29:20 crc kubenswrapper[4795]: I0310 15:29:20.854107 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6edfb391-f92d-4a3c-9e60-e32038dc9f5e","Type":"ContainerStarted","Data":"5dcd9603110faa5d888a00ba26e7dffa56d536a20b1cec6b69082f36de1fa4ff"} Mar 10 15:29:20 crc kubenswrapper[4795]: I0310 15:29:20.878972 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.878953099 podStartE2EDuration="2.878953099s" podCreationTimestamp="2026-03-10 15:29:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:29:20.875570172 +0000 UTC m=+1394.041311090" watchObservedRunningTime="2026-03-10 15:29:20.878953099 +0000 UTC m=+1394.044693997" Mar 10 15:29:21 crc kubenswrapper[4795]: I0310 15:29:21.443593 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 15:29:21 crc kubenswrapper[4795]: I0310 15:29:21.443682 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 10 15:29:22 crc kubenswrapper[4795]: I0310 15:29:22.562234 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 10 15:29:26 crc kubenswrapper[4795]: I0310 15:29:26.444523 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 15:29:26 crc kubenswrapper[4795]: I0310 15:29:26.445119 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 10 15:29:27 crc kubenswrapper[4795]: I0310 15:29:27.459254 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b9d34d9b-f1e9-420e-9b30-d99a9b30f33c" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:29:27 crc kubenswrapper[4795]: I0310 15:29:27.459317 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b9d34d9b-f1e9-420e-9b30-d99a9b30f33c" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.212:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:29:27 crc kubenswrapper[4795]: I0310 15:29:27.562222 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 10 15:29:27 crc 
kubenswrapper[4795]: I0310 15:29:27.589042 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 10 15:29:27 crc kubenswrapper[4795]: I0310 15:29:27.952547 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 10 15:29:29 crc kubenswrapper[4795]: I0310 15:29:29.220727 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:29:29 crc kubenswrapper[4795]: I0310 15:29:29.220994 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 10 15:29:30 crc kubenswrapper[4795]: I0310 15:29:30.237272 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6edfb391-f92d-4a3c-9e60-e32038dc9f5e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:29:30 crc kubenswrapper[4795]: I0310 15:29:30.237284 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6edfb391-f92d-4a3c-9e60-e32038dc9f5e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.214:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 10 15:29:34 crc kubenswrapper[4795]: I0310 15:29:34.031996 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 10 15:29:36 crc kubenswrapper[4795]: I0310 15:29:36.539297 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 15:29:36 crc kubenswrapper[4795]: I0310 15:29:36.606445 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 10 15:29:36 crc kubenswrapper[4795]: I0310 15:29:36.619612 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 15:29:37 crc kubenswrapper[4795]: I0310 15:29:37.021544 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 10 15:29:39 crc kubenswrapper[4795]: I0310 15:29:39.231457 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 15:29:39 crc kubenswrapper[4795]: I0310 15:29:39.232627 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 15:29:39 crc kubenswrapper[4795]: I0310 15:29:39.233903 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 10 15:29:39 crc kubenswrapper[4795]: I0310 15:29:39.241921 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 15:29:40 crc kubenswrapper[4795]: I0310 15:29:40.054391 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 10 15:29:40 crc kubenswrapper[4795]: I0310 15:29:40.063912 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 10 15:29:48 crc kubenswrapper[4795]: I0310 15:29:48.091829 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:29:48 crc kubenswrapper[4795]: I0310 15:29:48.538838 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:29:48 crc kubenswrapper[4795]: I0310 15:29:48.539028 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:29:49 crc kubenswrapper[4795]: I0310 15:29:49.134165 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:29:52 crc kubenswrapper[4795]: I0310 15:29:52.122722 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="c1ccf1a8-3778-482d-b6b5-303de43c6a7e" containerName="rabbitmq" containerID="cri-o://1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e" gracePeriod=604796 Mar 10 15:29:53 crc kubenswrapper[4795]: I0310 15:29:53.237656 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" containerName="rabbitmq" containerID="cri-o://0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e" gracePeriod=604796 Mar 10 15:29:57 crc kubenswrapper[4795]: I0310 15:29:57.070846 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Mar 10 15:29:57 crc kubenswrapper[4795]: I0310 15:29:57.357535 4795 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="c1ccf1a8-3778-482d-b6b5-303de43c6a7e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.711867 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.810401 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-pod-info\") pod \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.810449 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-erlang-cookie-secret\") pod \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.810531 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-erlang-cookie\") pod \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.810558 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-server-conf\") pod \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.810575 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-confd\") pod \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.810633 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-config-data\") pod \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.810656 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-plugins\") pod \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.810737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.810766 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rv5s\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-kube-api-access-9rv5s\") pod \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.810785 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-plugins-conf\") pod \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.810852 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-tls\") pod \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\" (UID: \"c1ccf1a8-3778-482d-b6b5-303de43c6a7e\") " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 
15:29:58.811316 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c1ccf1a8-3778-482d-b6b5-303de43c6a7e" (UID: "c1ccf1a8-3778-482d-b6b5-303de43c6a7e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.811378 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c1ccf1a8-3778-482d-b6b5-303de43c6a7e" (UID: "c1ccf1a8-3778-482d-b6b5-303de43c6a7e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.811898 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.811918 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.812312 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c1ccf1a8-3778-482d-b6b5-303de43c6a7e" (UID: "c1ccf1a8-3778-482d-b6b5-303de43c6a7e"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.817919 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c1ccf1a8-3778-482d-b6b5-303de43c6a7e" (UID: "c1ccf1a8-3778-482d-b6b5-303de43c6a7e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.831177 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c1ccf1a8-3778-482d-b6b5-303de43c6a7e" (UID: "c1ccf1a8-3778-482d-b6b5-303de43c6a7e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.833268 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "persistence") pod "c1ccf1a8-3778-482d-b6b5-303de43c6a7e" (UID: "c1ccf1a8-3778-482d-b6b5-303de43c6a7e"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.833364 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-pod-info" (OuterVolumeSpecName: "pod-info") pod "c1ccf1a8-3778-482d-b6b5-303de43c6a7e" (UID: "c1ccf1a8-3778-482d-b6b5-303de43c6a7e"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.845218 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-kube-api-access-9rv5s" (OuterVolumeSpecName: "kube-api-access-9rv5s") pod "c1ccf1a8-3778-482d-b6b5-303de43c6a7e" (UID: "c1ccf1a8-3778-482d-b6b5-303de43c6a7e"). InnerVolumeSpecName "kube-api-access-9rv5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.888573 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-config-data" (OuterVolumeSpecName: "config-data") pod "c1ccf1a8-3778-482d-b6b5-303de43c6a7e" (UID: "c1ccf1a8-3778-482d-b6b5-303de43c6a7e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.894200 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-server-conf" (OuterVolumeSpecName: "server-conf") pod "c1ccf1a8-3778-482d-b6b5-303de43c6a7e" (UID: "c1ccf1a8-3778-482d-b6b5-303de43c6a7e"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.914158 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.914192 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rv5s\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-kube-api-access-9rv5s\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.914208 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.914219 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.914230 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.914239 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.914252 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.914262 4795 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.950850 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 10 15:29:58 crc kubenswrapper[4795]: I0310 15:29:58.957097 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c1ccf1a8-3778-482d-b6b5-303de43c6a7e" (UID: "c1ccf1a8-3778-482d-b6b5-303de43c6a7e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.016695 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c1ccf1a8-3778-482d-b6b5-303de43c6a7e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.016730 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.265725 4795 generic.go:334] "Generic (PLEG): container finished" podID="c1ccf1a8-3778-482d-b6b5-303de43c6a7e" containerID="1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e" exitCode=0 Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.265932 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c1ccf1a8-3778-482d-b6b5-303de43c6a7e","Type":"ContainerDied","Data":"1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e"} Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.266040 
4795 scope.go:117] "RemoveContainer" containerID="1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.266040 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.266023 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c1ccf1a8-3778-482d-b6b5-303de43c6a7e","Type":"ContainerDied","Data":"c240581124d1424e85a4e35aae8efb12e4614062bbfd9b8f8a1c31d6e9146809"} Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.289058 4795 scope.go:117] "RemoveContainer" containerID="e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.323065 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.330300 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.354979 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:29:59 crc kubenswrapper[4795]: E0310 15:29:59.355540 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ccf1a8-3778-482d-b6b5-303de43c6a7e" containerName="setup-container" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.355564 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ccf1a8-3778-482d-b6b5-303de43c6a7e" containerName="setup-container" Mar 10 15:29:59 crc kubenswrapper[4795]: E0310 15:29:59.355587 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1ccf1a8-3778-482d-b6b5-303de43c6a7e" containerName="rabbitmq" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.355595 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1ccf1a8-3778-482d-b6b5-303de43c6a7e" 
containerName="rabbitmq" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.360410 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1ccf1a8-3778-482d-b6b5-303de43c6a7e" containerName="rabbitmq" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.364790 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.370977 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.371012 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.371174 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.371260 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.371321 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.371372 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-24pfv" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.371562 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.382430 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.424370 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.424413 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9cd90de-c39a-41a8-92cb-1f2dc799209f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.424460 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.424490 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.424520 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9cd90de-c39a-41a8-92cb-1f2dc799209f-config-data\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.424547 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6wc8\" (UniqueName: \"kubernetes.io/projected/b9cd90de-c39a-41a8-92cb-1f2dc799209f-kube-api-access-j6wc8\") pod \"rabbitmq-server-0\" (UID: 
\"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.424566 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9cd90de-c39a-41a8-92cb-1f2dc799209f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.424610 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.424633 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9cd90de-c39a-41a8-92cb-1f2dc799209f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.424654 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9cd90de-c39a-41a8-92cb-1f2dc799209f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.424682 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " 
pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.463936 4795 scope.go:117] "RemoveContainer" containerID="1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e" Mar 10 15:29:59 crc kubenswrapper[4795]: E0310 15:29:59.465483 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e\": container with ID starting with 1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e not found: ID does not exist" containerID="1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.465525 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e"} err="failed to get container status \"1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e\": rpc error: code = NotFound desc = could not find container \"1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e\": container with ID starting with 1200d70a88b1df6fbb41d71cec6a6f0c88432461a1d2a4538716a4fc492b5a0e not found: ID does not exist" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.465549 4795 scope.go:117] "RemoveContainer" containerID="e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046" Mar 10 15:29:59 crc kubenswrapper[4795]: E0310 15:29:59.465970 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046\": container with ID starting with e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046 not found: ID does not exist" containerID="e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.465992 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046"} err="failed to get container status \"e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046\": rpc error: code = NotFound desc = could not find container \"e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046\": container with ID starting with e3450e979c51a432348677be567a4bf7c3ae11314cba37f08eac63c113254046 not found: ID does not exist" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.487942 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1ccf1a8-3778-482d-b6b5-303de43c6a7e" path="/var/lib/kubelet/pods/c1ccf1a8-3778-482d-b6b5-303de43c6a7e/volumes" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.526707 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.526767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.526792 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9cd90de-c39a-41a8-92cb-1f2dc799209f-config-data\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.526821 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j6wc8\" (UniqueName: \"kubernetes.io/projected/b9cd90de-c39a-41a8-92cb-1f2dc799209f-kube-api-access-j6wc8\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.526843 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9cd90de-c39a-41a8-92cb-1f2dc799209f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.526888 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.526910 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9cd90de-c39a-41a8-92cb-1f2dc799209f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.526927 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9cd90de-c39a-41a8-92cb-1f2dc799209f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.526958 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.527003 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.527018 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9cd90de-c39a-41a8-92cb-1f2dc799209f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.527617 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.527776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.527872 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " 
pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.528510 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9cd90de-c39a-41a8-92cb-1f2dc799209f-config-data\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.528547 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b9cd90de-c39a-41a8-92cb-1f2dc799209f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.528663 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b9cd90de-c39a-41a8-92cb-1f2dc799209f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.531459 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b9cd90de-c39a-41a8-92cb-1f2dc799209f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.531553 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b9cd90de-c39a-41a8-92cb-1f2dc799209f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.531740 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.532135 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b9cd90de-c39a-41a8-92cb-1f2dc799209f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.545171 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6wc8\" (UniqueName: \"kubernetes.io/projected/b9cd90de-c39a-41a8-92cb-1f2dc799209f-kube-api-access-j6wc8\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.562889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"rabbitmq-server-0\" (UID: \"b9cd90de-c39a-41a8-92cb-1f2dc799209f\") " pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.701874 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.889418 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.933951 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b0778eb-949e-46e9-bc72-cc42ec440aa2-erlang-cookie-secret\") pod \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.934029 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-confd\") pod \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.934065 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.934191 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b0778eb-949e-46e9-bc72-cc42ec440aa2-pod-info\") pod \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.934250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-plugins\") pod \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.934327 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkmzq\" (UniqueName: 
\"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-kube-api-access-fkmzq\") pod \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.934350 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-plugins-conf\") pod \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.934406 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-server-conf\") pod \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.934438 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-config-data\") pod \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.934462 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-erlang-cookie\") pod \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.934499 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-tls\") pod \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\" (UID: \"9b0778eb-949e-46e9-bc72-cc42ec440aa2\") " Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 
15:29:59.935119 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9b0778eb-949e-46e9-bc72-cc42ec440aa2" (UID: "9b0778eb-949e-46e9-bc72-cc42ec440aa2"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.938789 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9b0778eb-949e-46e9-bc72-cc42ec440aa2" (UID: "9b0778eb-949e-46e9-bc72-cc42ec440aa2"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.942147 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b0778eb-949e-46e9-bc72-cc42ec440aa2-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9b0778eb-949e-46e9-bc72-cc42ec440aa2" (UID: "9b0778eb-949e-46e9-bc72-cc42ec440aa2"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.942345 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "9b0778eb-949e-46e9-bc72-cc42ec440aa2" (UID: "9b0778eb-949e-46e9-bc72-cc42ec440aa2"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.942709 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "9b0778eb-949e-46e9-bc72-cc42ec440aa2" (UID: "9b0778eb-949e-46e9-bc72-cc42ec440aa2"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.943122 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9b0778eb-949e-46e9-bc72-cc42ec440aa2" (UID: "9b0778eb-949e-46e9-bc72-cc42ec440aa2"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.950279 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-kube-api-access-fkmzq" (OuterVolumeSpecName: "kube-api-access-fkmzq") pod "9b0778eb-949e-46e9-bc72-cc42ec440aa2" (UID: "9b0778eb-949e-46e9-bc72-cc42ec440aa2"). InnerVolumeSpecName "kube-api-access-fkmzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.952016 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9b0778eb-949e-46e9-bc72-cc42ec440aa2-pod-info" (OuterVolumeSpecName: "pod-info") pod "9b0778eb-949e-46e9-bc72-cc42ec440aa2" (UID: "9b0778eb-949e-46e9-bc72-cc42ec440aa2"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 10 15:29:59 crc kubenswrapper[4795]: I0310 15:29:59.984672 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-config-data" (OuterVolumeSpecName: "config-data") pod "9b0778eb-949e-46e9-bc72-cc42ec440aa2" (UID: "9b0778eb-949e-46e9-bc72-cc42ec440aa2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.008675 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-server-conf" (OuterVolumeSpecName: "server-conf") pod "9b0778eb-949e-46e9-bc72-cc42ec440aa2" (UID: "9b0778eb-949e-46e9-bc72-cc42ec440aa2"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.036554 4795 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9b0778eb-949e-46e9-bc72-cc42ec440aa2-pod-info\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.036593 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.036609 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkmzq\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-kube-api-access-fkmzq\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.036624 4795 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-plugins-conf\") on node \"crc\" DevicePath 
\"\"" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.036636 4795 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-server-conf\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.036647 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9b0778eb-949e-46e9-bc72-cc42ec440aa2-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.036662 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.036673 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.036684 4795 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9b0778eb-949e-46e9-bc72-cc42ec440aa2-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.036720 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.069511 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.076913 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9b0778eb-949e-46e9-bc72-cc42ec440aa2" (UID: "9b0778eb-949e-46e9-bc72-cc42ec440aa2"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.139426 4795 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9b0778eb-949e-46e9-bc72-cc42ec440aa2-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.139627 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.140995 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552610-752kh"] Mar 10 15:30:00 crc kubenswrapper[4795]: E0310 15:30:00.141346 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" containerName="rabbitmq" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.141371 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" containerName="rabbitmq" Mar 10 15:30:00 crc kubenswrapper[4795]: E0310 15:30:00.141410 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" containerName="setup-container" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.141416 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" containerName="setup-container" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.141575 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" containerName="rabbitmq" Mar 10 15:30:00 crc 
kubenswrapper[4795]: I0310 15:30:00.142111 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552610-752kh" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.144453 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.144629 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.145710 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.158741 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552610-752kh"] Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.158788 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn"] Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.160189 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.334499 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.334730 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.336799 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvhpz\" (UniqueName: \"kubernetes.io/projected/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-kube-api-access-kvhpz\") pod \"collect-profiles-29552610-kkshn\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.336897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twmfd\" (UniqueName: \"kubernetes.io/projected/2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5-kube-api-access-twmfd\") pod \"auto-csr-approver-29552610-752kh\" (UID: \"2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5\") " pod="openshift-infra/auto-csr-approver-29552610-752kh" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.336947 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-secret-volume\") pod \"collect-profiles-29552610-kkshn\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.336991 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-config-volume\") pod \"collect-profiles-29552610-kkshn\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.340449 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn"] Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.381462 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" containerID="0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e" exitCode=0 Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.381616 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.381673 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9b0778eb-949e-46e9-bc72-cc42ec440aa2","Type":"ContainerDied","Data":"0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e"} Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.382580 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"9b0778eb-949e-46e9-bc72-cc42ec440aa2","Type":"ContainerDied","Data":"77e560db3899b1ef8ba7d47cb762ecd235a93da39c5376d9cccb6e3aa082d09a"} Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.382609 4795 scope.go:117] "RemoveContainer" containerID="0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.442928 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvhpz\" (UniqueName: \"kubernetes.io/projected/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-kube-api-access-kvhpz\") pod 
\"collect-profiles-29552610-kkshn\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.443332 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twmfd\" (UniqueName: \"kubernetes.io/projected/2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5-kube-api-access-twmfd\") pod \"auto-csr-approver-29552610-752kh\" (UID: \"2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5\") " pod="openshift-infra/auto-csr-approver-29552610-752kh" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.443470 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-secret-volume\") pod \"collect-profiles-29552610-kkshn\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.443531 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-config-volume\") pod \"collect-profiles-29552610-kkshn\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.444564 4795 scope.go:117] "RemoveContainer" containerID="2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.444658 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-config-volume\") pod \"collect-profiles-29552610-kkshn\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.450264 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.463822 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-secret-volume\") pod \"collect-profiles-29552610-kkshn\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.466804 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.490577 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twmfd\" (UniqueName: \"kubernetes.io/projected/2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5-kube-api-access-twmfd\") pod \"auto-csr-approver-29552610-752kh\" (UID: \"2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5\") " pod="openshift-infra/auto-csr-approver-29552610-752kh" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.506315 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvhpz\" (UniqueName: \"kubernetes.io/projected/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-kube-api-access-kvhpz\") pod \"collect-profiles-29552610-kkshn\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.507648 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.543391 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 
15:30:00.550254 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.552455 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.552673 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-f5j5r" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.552764 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.553970 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.555022 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.560914 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.560641 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.562501 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.594006 4795 scope.go:117] "RemoveContainer" containerID="0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e" Mar 10 15:30:00 crc kubenswrapper[4795]: E0310 15:30:00.594688 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e\": container with ID starting with 
0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e not found: ID does not exist" containerID="0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.594727 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e"} err="failed to get container status \"0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e\": rpc error: code = NotFound desc = could not find container \"0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e\": container with ID starting with 0810d729a7fde7b1313e8104dca12bc3d68814efa2e2a3e70a7941190273dc1e not found: ID does not exist" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.594771 4795 scope.go:117] "RemoveContainer" containerID="2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85" Mar 10 15:30:00 crc kubenswrapper[4795]: E0310 15:30:00.595085 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85\": container with ID starting with 2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85 not found: ID does not exist" containerID="2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.595112 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85"} err="failed to get container status \"2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85\": rpc error: code = NotFound desc = could not find container \"2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85\": container with ID starting with 2b85890740eeb703fca7a88157d922fdf69b9f0c5a581884e7babe645c9f6f85 not found: ID does not 
exist" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.646115 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.646189 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.646227 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/460e64c3-819a-4bfe-859f-ea0b5400be76-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.646310 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/460e64c3-819a-4bfe-859f-ea0b5400be76-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.646350 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/460e64c3-819a-4bfe-859f-ea0b5400be76-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" 
Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.646365 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/460e64c3-819a-4bfe-859f-ea0b5400be76-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.646390 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.646448 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/460e64c3-819a-4bfe-859f-ea0b5400be76-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.646463 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.646506 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2pph\" (UniqueName: \"kubernetes.io/projected/460e64c3-819a-4bfe-859f-ea0b5400be76-kube-api-access-j2pph\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.646528 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.732198 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552610-752kh" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.750014 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.750605 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/460e64c3-819a-4bfe-859f-ea0b5400be76-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.750695 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/460e64c3-819a-4bfe-859f-ea0b5400be76-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.750722 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/460e64c3-819a-4bfe-859f-ea0b5400be76-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 
15:30:00.750738 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/460e64c3-819a-4bfe-859f-ea0b5400be76-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.750767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.750814 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/460e64c3-819a-4bfe-859f-ea0b5400be76-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.750838 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.750869 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2pph\" (UniqueName: \"kubernetes.io/projected/460e64c3-819a-4bfe-859f-ea0b5400be76-kube-api-access-j2pph\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.750896 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.750921 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.750969 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.751474 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.751764 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.752136 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.754039 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/460e64c3-819a-4bfe-859f-ea0b5400be76-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.754045 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/460e64c3-819a-4bfe-859f-ea0b5400be76-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.754165 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/460e64c3-819a-4bfe-859f-ea0b5400be76-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.757480 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/460e64c3-819a-4bfe-859f-ea0b5400be76-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.757717 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/460e64c3-819a-4bfe-859f-ea0b5400be76-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.758508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.770889 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2pph\" (UniqueName: \"kubernetes.io/projected/460e64c3-819a-4bfe-859f-ea0b5400be76-kube-api-access-j2pph\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.771215 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/460e64c3-819a-4bfe-859f-ea0b5400be76-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:00 crc kubenswrapper[4795]: I0310 15:30:00.808763 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"460e64c3-819a-4bfe-859f-ea0b5400be76\") " pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:01 crc kubenswrapper[4795]: I0310 15:30:01.005794 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:01 crc kubenswrapper[4795]: I0310 15:30:01.213057 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552610-752kh"] Mar 10 15:30:01 crc kubenswrapper[4795]: I0310 15:30:01.285122 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn"] Mar 10 15:30:01 crc kubenswrapper[4795]: W0310 15:30:01.286541 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d1c70b3_a273_4464_8299_bf1bc53ee0ce.slice/crio-aa742de755d2d30f14700b7ef89a250d781b0a868a225ad65419b2d0295cc5e5 WatchSource:0}: Error finding container aa742de755d2d30f14700b7ef89a250d781b0a868a225ad65419b2d0295cc5e5: Status 404 returned error can't find the container with id aa742de755d2d30f14700b7ef89a250d781b0a868a225ad65419b2d0295cc5e5 Mar 10 15:30:01 crc kubenswrapper[4795]: I0310 15:30:01.392755 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" event={"ID":"4d1c70b3-a273-4464-8299-bf1bc53ee0ce","Type":"ContainerStarted","Data":"aa742de755d2d30f14700b7ef89a250d781b0a868a225ad65419b2d0295cc5e5"} Mar 10 15:30:01 crc kubenswrapper[4795]: I0310 15:30:01.393589 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552610-752kh" event={"ID":"2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5","Type":"ContainerStarted","Data":"36b8d41d88332b35034c83313776c4dc38c34f234729f257002694f723d82e2a"} Mar 10 15:30:01 crc kubenswrapper[4795]: I0310 15:30:01.394326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9cd90de-c39a-41a8-92cb-1f2dc799209f","Type":"ContainerStarted","Data":"23d0bd796646a967710822956df20f0a10c48e0a3be4e85bcb856bbf9e836767"} Mar 10 15:30:01 crc kubenswrapper[4795]: I0310 
15:30:01.488629 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b0778eb-949e-46e9-bc72-cc42ec440aa2" path="/var/lib/kubelet/pods/9b0778eb-949e-46e9-bc72-cc42ec440aa2/volumes" Mar 10 15:30:01 crc kubenswrapper[4795]: I0310 15:30:01.510177 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.223150 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-xxf7n"] Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.226357 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.229052 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.233264 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-xxf7n"] Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.287255 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.287311 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.287349 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-config\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.287389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-svc\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.287407 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.287481 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.287529 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v7ks\" (UniqueName: \"kubernetes.io/projected/b48f6504-950a-40c3-858d-12a81641d180-kube-api-access-6v7ks\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.388548 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.388623 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v7ks\" (UniqueName: \"kubernetes.io/projected/b48f6504-950a-40c3-858d-12a81641d180-kube-api-access-6v7ks\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.388723 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.388756 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.388783 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-config\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.388811 4795 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-svc\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.388835 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.389901 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.389918 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.389930 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.390612 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-config\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.390853 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.391542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-svc\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.404006 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"460e64c3-819a-4bfe-859f-ea0b5400be76","Type":"ContainerStarted","Data":"a3dd6a50a6ca75c59f667b5ee985606664661649b6f4d5b014f141cbac8ce2be"} Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.405134 4795 generic.go:334] "Generic (PLEG): container finished" podID="4d1c70b3-a273-4464-8299-bf1bc53ee0ce" containerID="06b7481474a286e3f5e473bc2958e287fe2e1e85c54958734b5c8dcbdfa0a21b" exitCode=0 Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.405190 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" event={"ID":"4d1c70b3-a273-4464-8299-bf1bc53ee0ce","Type":"ContainerDied","Data":"06b7481474a286e3f5e473bc2958e287fe2e1e85c54958734b5c8dcbdfa0a21b"} Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.406744 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"b9cd90de-c39a-41a8-92cb-1f2dc799209f","Type":"ContainerStarted","Data":"78828fa71d141129fd642b098d5a569a75a16cee03153195b75a8c419e07d9af"} Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.415436 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v7ks\" (UniqueName: \"kubernetes.io/projected/b48f6504-950a-40c3-858d-12a81641d180-kube-api-access-6v7ks\") pod \"dnsmasq-dns-d558885bc-xxf7n\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:02 crc kubenswrapper[4795]: I0310 15:30:02.554545 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.418559 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"460e64c3-819a-4bfe-859f-ea0b5400be76","Type":"ContainerStarted","Data":"5c50e643fc7d2f517473132a3cdba7ee99c3684f736f8a5f92dcd984deaad5e5"} Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.465024 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-xxf7n"] Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.718890 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.731234 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-config-volume\") pod \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.731521 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvhpz\" (UniqueName: \"kubernetes.io/projected/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-kube-api-access-kvhpz\") pod \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.731645 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-secret-volume\") pod \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\" (UID: \"4d1c70b3-a273-4464-8299-bf1bc53ee0ce\") " Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.735428 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-config-volume" (OuterVolumeSpecName: "config-volume") pod "4d1c70b3-a273-4464-8299-bf1bc53ee0ce" (UID: "4d1c70b3-a273-4464-8299-bf1bc53ee0ce"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.758287 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4d1c70b3-a273-4464-8299-bf1bc53ee0ce" (UID: "4d1c70b3-a273-4464-8299-bf1bc53ee0ce"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.758651 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-kube-api-access-kvhpz" (OuterVolumeSpecName: "kube-api-access-kvhpz") pod "4d1c70b3-a273-4464-8299-bf1bc53ee0ce" (UID: "4d1c70b3-a273-4464-8299-bf1bc53ee0ce"). InnerVolumeSpecName "kube-api-access-kvhpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.833735 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvhpz\" (UniqueName: \"kubernetes.io/projected/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-kube-api-access-kvhpz\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.833782 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:03 crc kubenswrapper[4795]: I0310 15:30:03.833796 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4d1c70b3-a273-4464-8299-bf1bc53ee0ce-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:04 crc kubenswrapper[4795]: I0310 15:30:04.434255 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" Mar 10 15:30:04 crc kubenswrapper[4795]: I0310 15:30:04.434272 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552610-kkshn" event={"ID":"4d1c70b3-a273-4464-8299-bf1bc53ee0ce","Type":"ContainerDied","Data":"aa742de755d2d30f14700b7ef89a250d781b0a868a225ad65419b2d0295cc5e5"} Mar 10 15:30:04 crc kubenswrapper[4795]: I0310 15:30:04.434971 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa742de755d2d30f14700b7ef89a250d781b0a868a225ad65419b2d0295cc5e5" Mar 10 15:30:04 crc kubenswrapper[4795]: I0310 15:30:04.436604 4795 generic.go:334] "Generic (PLEG): container finished" podID="b48f6504-950a-40c3-858d-12a81641d180" containerID="1667e1c0d207723c3f1ed684a597536617a85b6c69f07bd159506ad977c46ce7" exitCode=0 Mar 10 15:30:04 crc kubenswrapper[4795]: I0310 15:30:04.436661 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" event={"ID":"b48f6504-950a-40c3-858d-12a81641d180","Type":"ContainerDied","Data":"1667e1c0d207723c3f1ed684a597536617a85b6c69f07bd159506ad977c46ce7"} Mar 10 15:30:04 crc kubenswrapper[4795]: I0310 15:30:04.436687 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" event={"ID":"b48f6504-950a-40c3-858d-12a81641d180","Type":"ContainerStarted","Data":"061e0a51206479568650751b31c9157c823011ca703ce1aa5a2b5ff63b3a51fc"} Mar 10 15:30:04 crc kubenswrapper[4795]: I0310 15:30:04.440177 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552610-752kh" event={"ID":"2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5","Type":"ContainerStarted","Data":"4ba85f207212863dae9dfe324ff3534baf611b9fb69980bbcb335b5e6241edc9"} Mar 10 15:30:05 crc kubenswrapper[4795]: I0310 15:30:05.450974 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-d558885bc-xxf7n" event={"ID":"b48f6504-950a-40c3-858d-12a81641d180","Type":"ContainerStarted","Data":"ef811a407deffe044267fa81b7e5926700f5bb353043d994d889927922e6d65e"} Mar 10 15:30:05 crc kubenswrapper[4795]: I0310 15:30:05.452289 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:05 crc kubenswrapper[4795]: I0310 15:30:05.452801 4795 generic.go:334] "Generic (PLEG): container finished" podID="2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5" containerID="4ba85f207212863dae9dfe324ff3534baf611b9fb69980bbcb335b5e6241edc9" exitCode=0 Mar 10 15:30:05 crc kubenswrapper[4795]: I0310 15:30:05.452888 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552610-752kh" event={"ID":"2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5","Type":"ContainerDied","Data":"4ba85f207212863dae9dfe324ff3534baf611b9fb69980bbcb335b5e6241edc9"} Mar 10 15:30:05 crc kubenswrapper[4795]: I0310 15:30:05.483884 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" podStartSLOduration=3.483867081 podStartE2EDuration="3.483867081s" podCreationTimestamp="2026-03-10 15:30:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:05.479080085 +0000 UTC m=+1438.644820983" watchObservedRunningTime="2026-03-10 15:30:05.483867081 +0000 UTC m=+1438.649607979" Mar 10 15:30:05 crc kubenswrapper[4795]: I0310 15:30:05.864626 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552610-752kh" Mar 10 15:30:05 crc kubenswrapper[4795]: I0310 15:30:05.972344 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twmfd\" (UniqueName: \"kubernetes.io/projected/2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5-kube-api-access-twmfd\") pod \"2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5\" (UID: \"2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5\") " Mar 10 15:30:05 crc kubenswrapper[4795]: I0310 15:30:05.977527 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5-kube-api-access-twmfd" (OuterVolumeSpecName: "kube-api-access-twmfd") pod "2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5" (UID: "2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5"). InnerVolumeSpecName "kube-api-access-twmfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:06 crc kubenswrapper[4795]: I0310 15:30:06.074380 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twmfd\" (UniqueName: \"kubernetes.io/projected/2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5-kube-api-access-twmfd\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:06 crc kubenswrapper[4795]: I0310 15:30:06.464564 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552610-752kh" Mar 10 15:30:06 crc kubenswrapper[4795]: I0310 15:30:06.464775 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552610-752kh" event={"ID":"2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5","Type":"ContainerDied","Data":"36b8d41d88332b35034c83313776c4dc38c34f234729f257002694f723d82e2a"} Mar 10 15:30:06 crc kubenswrapper[4795]: I0310 15:30:06.465102 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36b8d41d88332b35034c83313776c4dc38c34f234729f257002694f723d82e2a" Mar 10 15:30:06 crc kubenswrapper[4795]: I0310 15:30:06.940857 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552604-2x2lf"] Mar 10 15:30:06 crc kubenswrapper[4795]: I0310 15:30:06.951476 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552604-2x2lf"] Mar 10 15:30:07 crc kubenswrapper[4795]: I0310 15:30:07.488012 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9" path="/var/lib/kubelet/pods/43e6ddbe-efaa-4fe9-a5b8-2e3547d073d9/volumes" Mar 10 15:30:12 crc kubenswrapper[4795]: I0310 15:30:12.556267 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:12 crc kubenswrapper[4795]: I0310 15:30:12.612333 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bdtrv"] Mar 10 15:30:12 crc kubenswrapper[4795]: I0310 15:30:12.612571 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" podUID="202d6bb3-868d-43da-81a0-1321d737fbc8" containerName="dnsmasq-dns" containerID="cri-o://1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4" gracePeriod=10 Mar 10 15:30:12 crc kubenswrapper[4795]: I0310 15:30:12.808679 4795 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-4k6tc"] Mar 10 15:30:12 crc kubenswrapper[4795]: E0310 15:30:12.809466 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5" containerName="oc" Mar 10 15:30:12 crc kubenswrapper[4795]: I0310 15:30:12.809485 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5" containerName="oc" Mar 10 15:30:12 crc kubenswrapper[4795]: E0310 15:30:12.809541 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d1c70b3-a273-4464-8299-bf1bc53ee0ce" containerName="collect-profiles" Mar 10 15:30:12 crc kubenswrapper[4795]: I0310 15:30:12.809550 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1c70b3-a273-4464-8299-bf1bc53ee0ce" containerName="collect-profiles" Mar 10 15:30:12 crc kubenswrapper[4795]: I0310 15:30:12.809778 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d1c70b3-a273-4464-8299-bf1bc53ee0ce" containerName="collect-profiles" Mar 10 15:30:12 crc kubenswrapper[4795]: I0310 15:30:12.809815 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5" containerName="oc" Mar 10 15:30:12 crc kubenswrapper[4795]: I0310 15:30:12.811201 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:12 crc kubenswrapper[4795]: I0310 15:30:12.819813 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-4k6tc"] Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.014507 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.014552 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.014595 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-config\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.015897 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jj5h\" (UniqueName: \"kubernetes.io/projected/888ca29c-cfea-4f70-8f7b-f0539e3df18b-kube-api-access-9jj5h\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.015955 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.016007 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.016088 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.117055 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.117123 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.117175 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-config\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.117256 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jj5h\" (UniqueName: \"kubernetes.io/projected/888ca29c-cfea-4f70-8f7b-f0539e3df18b-kube-api-access-9jj5h\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.117281 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.117318 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.117344 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.118117 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.118202 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.118410 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.118908 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.119055 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-config\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.119257 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/888ca29c-cfea-4f70-8f7b-f0539e3df18b-dns-swift-storage-0\") pod 
\"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.138864 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jj5h\" (UniqueName: \"kubernetes.io/projected/888ca29c-cfea-4f70-8f7b-f0539e3df18b-kube-api-access-9jj5h\") pod \"dnsmasq-dns-78c64bc9c5-4k6tc\" (UID: \"888ca29c-cfea-4f70-8f7b-f0539e3df18b\") " pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.179154 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.256017 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.421433 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-sb\") pod \"202d6bb3-868d-43da-81a0-1321d737fbc8\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.421482 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7v84\" (UniqueName: \"kubernetes.io/projected/202d6bb3-868d-43da-81a0-1321d737fbc8-kube-api-access-c7v84\") pod \"202d6bb3-868d-43da-81a0-1321d737fbc8\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.421607 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-nb\") pod \"202d6bb3-868d-43da-81a0-1321d737fbc8\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " Mar 10 15:30:13 crc 
kubenswrapper[4795]: I0310 15:30:13.422040 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-config\") pod \"202d6bb3-868d-43da-81a0-1321d737fbc8\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.422098 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-svc\") pod \"202d6bb3-868d-43da-81a0-1321d737fbc8\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.422203 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-swift-storage-0\") pod \"202d6bb3-868d-43da-81a0-1321d737fbc8\" (UID: \"202d6bb3-868d-43da-81a0-1321d737fbc8\") " Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.425823 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/202d6bb3-868d-43da-81a0-1321d737fbc8-kube-api-access-c7v84" (OuterVolumeSpecName: "kube-api-access-c7v84") pod "202d6bb3-868d-43da-81a0-1321d737fbc8" (UID: "202d6bb3-868d-43da-81a0-1321d737fbc8"). InnerVolumeSpecName "kube-api-access-c7v84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.479045 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "202d6bb3-868d-43da-81a0-1321d737fbc8" (UID: "202d6bb3-868d-43da-81a0-1321d737fbc8"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.481553 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-config" (OuterVolumeSpecName: "config") pod "202d6bb3-868d-43da-81a0-1321d737fbc8" (UID: "202d6bb3-868d-43da-81a0-1321d737fbc8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.481598 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "202d6bb3-868d-43da-81a0-1321d737fbc8" (UID: "202d6bb3-868d-43da-81a0-1321d737fbc8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.481754 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "202d6bb3-868d-43da-81a0-1321d737fbc8" (UID: "202d6bb3-868d-43da-81a0-1321d737fbc8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.491948 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "202d6bb3-868d-43da-81a0-1321d737fbc8" (UID: "202d6bb3-868d-43da-81a0-1321d737fbc8"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.524556 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.524583 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.524594 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.524605 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.524614 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7v84\" (UniqueName: \"kubernetes.io/projected/202d6bb3-868d-43da-81a0-1321d737fbc8-kube-api-access-c7v84\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.524622 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/202d6bb3-868d-43da-81a0-1321d737fbc8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.542116 4795 generic.go:334] "Generic (PLEG): container finished" podID="202d6bb3-868d-43da-81a0-1321d737fbc8" containerID="1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4" exitCode=0 Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.542154 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" event={"ID":"202d6bb3-868d-43da-81a0-1321d737fbc8","Type":"ContainerDied","Data":"1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4"} Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.542179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" event={"ID":"202d6bb3-868d-43da-81a0-1321d737fbc8","Type":"ContainerDied","Data":"05a73fdbc42d5cbb896ca3ab5ecf18863726d0a0f514dd0714cbb1823c20994a"} Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.542193 4795 scope.go:117] "RemoveContainer" containerID="1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.542230 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-bdtrv" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.585416 4795 scope.go:117] "RemoveContainer" containerID="712d548dcbdb043e5d9bec46f221b1d16d692ce9e531834474a0d804e287517d" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.604559 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bdtrv"] Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.616279 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bdtrv"] Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.617025 4795 scope.go:117] "RemoveContainer" containerID="1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4" Mar 10 15:30:13 crc kubenswrapper[4795]: E0310 15:30:13.617384 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4\": container with ID starting with 1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4 not found: ID does not exist" 
containerID="1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.617428 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4"} err="failed to get container status \"1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4\": rpc error: code = NotFound desc = could not find container \"1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4\": container with ID starting with 1f28e49e609737f062780765449d79d9f79126cdb0ea8ee80d0b23165d3a09a4 not found: ID does not exist" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.617454 4795 scope.go:117] "RemoveContainer" containerID="712d548dcbdb043e5d9bec46f221b1d16d692ce9e531834474a0d804e287517d" Mar 10 15:30:13 crc kubenswrapper[4795]: E0310 15:30:13.617731 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712d548dcbdb043e5d9bec46f221b1d16d692ce9e531834474a0d804e287517d\": container with ID starting with 712d548dcbdb043e5d9bec46f221b1d16d692ce9e531834474a0d804e287517d not found: ID does not exist" containerID="712d548dcbdb043e5d9bec46f221b1d16d692ce9e531834474a0d804e287517d" Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.617754 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712d548dcbdb043e5d9bec46f221b1d16d692ce9e531834474a0d804e287517d"} err="failed to get container status \"712d548dcbdb043e5d9bec46f221b1d16d692ce9e531834474a0d804e287517d\": rpc error: code = NotFound desc = could not find container \"712d548dcbdb043e5d9bec46f221b1d16d692ce9e531834474a0d804e287517d\": container with ID starting with 712d548dcbdb043e5d9bec46f221b1d16d692ce9e531834474a0d804e287517d not found: ID does not exist" Mar 10 15:30:13 crc kubenswrapper[4795]: W0310 15:30:13.648510 4795 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod888ca29c_cfea_4f70_8f7b_f0539e3df18b.slice/crio-1420143b27bf36fd4dc4d289847f6537b1ec6fa867497ff1f2f549bdca4b438c WatchSource:0}: Error finding container 1420143b27bf36fd4dc4d289847f6537b1ec6fa867497ff1f2f549bdca4b438c: Status 404 returned error can't find the container with id 1420143b27bf36fd4dc4d289847f6537b1ec6fa867497ff1f2f549bdca4b438c Mar 10 15:30:13 crc kubenswrapper[4795]: I0310 15:30:13.651191 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-4k6tc"] Mar 10 15:30:14 crc kubenswrapper[4795]: I0310 15:30:14.555167 4795 generic.go:334] "Generic (PLEG): container finished" podID="888ca29c-cfea-4f70-8f7b-f0539e3df18b" containerID="7d5f82f8af8423c4cca181bf9dfe40047f514c94e102cea6a8f132e48d1ab3a9" exitCode=0 Mar 10 15:30:14 crc kubenswrapper[4795]: I0310 15:30:14.555230 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" event={"ID":"888ca29c-cfea-4f70-8f7b-f0539e3df18b","Type":"ContainerDied","Data":"7d5f82f8af8423c4cca181bf9dfe40047f514c94e102cea6a8f132e48d1ab3a9"} Mar 10 15:30:14 crc kubenswrapper[4795]: I0310 15:30:14.555417 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" event={"ID":"888ca29c-cfea-4f70-8f7b-f0539e3df18b","Type":"ContainerStarted","Data":"1420143b27bf36fd4dc4d289847f6537b1ec6fa867497ff1f2f549bdca4b438c"} Mar 10 15:30:15 crc kubenswrapper[4795]: I0310 15:30:15.487555 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="202d6bb3-868d-43da-81a0-1321d737fbc8" path="/var/lib/kubelet/pods/202d6bb3-868d-43da-81a0-1321d737fbc8/volumes" Mar 10 15:30:15 crc kubenswrapper[4795]: I0310 15:30:15.566787 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" 
event={"ID":"888ca29c-cfea-4f70-8f7b-f0539e3df18b","Type":"ContainerStarted","Data":"74b87724e7630fbfb56806f935d40bce3c9956b10c8fd19557bb3b0db96c011f"} Mar 10 15:30:15 crc kubenswrapper[4795]: I0310 15:30:15.566967 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 15:30:15 crc kubenswrapper[4795]: I0310 15:30:15.605865 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" podStartSLOduration=3.605839406 podStartE2EDuration="3.605839406s" podCreationTimestamp="2026-03-10 15:30:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:15.60527755 +0000 UTC m=+1448.771018448" watchObservedRunningTime="2026-03-10 15:30:15.605839406 +0000 UTC m=+1448.771580334" Mar 10 15:30:18 crc kubenswrapper[4795]: I0310 15:30:18.538709 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:30:18 crc kubenswrapper[4795]: I0310 15:30:18.539099 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:30:18 crc kubenswrapper[4795]: I0310 15:30:18.539149 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:30:18 crc kubenswrapper[4795]: I0310 15:30:18.540272 4795 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb8b52e71ef30156b6c9527b50e9705f2708ddbef05f7d631d0751d8aba7942b"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:30:18 crc kubenswrapper[4795]: I0310 15:30:18.540334 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://bb8b52e71ef30156b6c9527b50e9705f2708ddbef05f7d631d0751d8aba7942b" gracePeriod=600 Mar 10 15:30:19 crc kubenswrapper[4795]: I0310 15:30:19.626429 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="bb8b52e71ef30156b6c9527b50e9705f2708ddbef05f7d631d0751d8aba7942b" exitCode=0 Mar 10 15:30:19 crc kubenswrapper[4795]: I0310 15:30:19.626508 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"bb8b52e71ef30156b6c9527b50e9705f2708ddbef05f7d631d0751d8aba7942b"} Mar 10 15:30:19 crc kubenswrapper[4795]: I0310 15:30:19.626977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64"} Mar 10 15:30:19 crc kubenswrapper[4795]: I0310 15:30:19.626996 4795 scope.go:117] "RemoveContainer" containerID="0a98890ab8851ba455c7b85a98ad7dd9a020e91e458aa5b6868584031febf67b" Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.181401 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78c64bc9c5-4k6tc" Mar 10 
15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.256138 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-xxf7n"] Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.256866 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" podUID="b48f6504-950a-40c3-858d-12a81641d180" containerName="dnsmasq-dns" containerID="cri-o://ef811a407deffe044267fa81b7e5926700f5bb353043d994d889927922e6d65e" gracePeriod=10 Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.669459 4795 generic.go:334] "Generic (PLEG): container finished" podID="b48f6504-950a-40c3-858d-12a81641d180" containerID="ef811a407deffe044267fa81b7e5926700f5bb353043d994d889927922e6d65e" exitCode=0 Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.669499 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" event={"ID":"b48f6504-950a-40c3-858d-12a81641d180","Type":"ContainerDied","Data":"ef811a407deffe044267fa81b7e5926700f5bb353043d994d889927922e6d65e"} Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.768587 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.964606 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-openstack-edpm-ipam\") pod \"b48f6504-950a-40c3-858d-12a81641d180\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.964786 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v7ks\" (UniqueName: \"kubernetes.io/projected/b48f6504-950a-40c3-858d-12a81641d180-kube-api-access-6v7ks\") pod \"b48f6504-950a-40c3-858d-12a81641d180\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.964872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-config\") pod \"b48f6504-950a-40c3-858d-12a81641d180\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.964903 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-svc\") pod \"b48f6504-950a-40c3-858d-12a81641d180\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.964923 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-sb\") pod \"b48f6504-950a-40c3-858d-12a81641d180\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.964959 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-nb\") pod \"b48f6504-950a-40c3-858d-12a81641d180\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.965003 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-swift-storage-0\") pod \"b48f6504-950a-40c3-858d-12a81641d180\" (UID: \"b48f6504-950a-40c3-858d-12a81641d180\") " Mar 10 15:30:23 crc kubenswrapper[4795]: I0310 15:30:23.974330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b48f6504-950a-40c3-858d-12a81641d180-kube-api-access-6v7ks" (OuterVolumeSpecName: "kube-api-access-6v7ks") pod "b48f6504-950a-40c3-858d-12a81641d180" (UID: "b48f6504-950a-40c3-858d-12a81641d180"). InnerVolumeSpecName "kube-api-access-6v7ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.015600 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-config" (OuterVolumeSpecName: "config") pod "b48f6504-950a-40c3-858d-12a81641d180" (UID: "b48f6504-950a-40c3-858d-12a81641d180"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.016248 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b48f6504-950a-40c3-858d-12a81641d180" (UID: "b48f6504-950a-40c3-858d-12a81641d180"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.019185 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b48f6504-950a-40c3-858d-12a81641d180" (UID: "b48f6504-950a-40c3-858d-12a81641d180"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.019272 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b48f6504-950a-40c3-858d-12a81641d180" (UID: "b48f6504-950a-40c3-858d-12a81641d180"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.024426 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b48f6504-950a-40c3-858d-12a81641d180" (UID: "b48f6504-950a-40c3-858d-12a81641d180"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.026090 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "b48f6504-950a-40c3-858d-12a81641d180" (UID: "b48f6504-950a-40c3-858d-12a81641d180"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.067428 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.067464 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v7ks\" (UniqueName: \"kubernetes.io/projected/b48f6504-950a-40c3-858d-12a81641d180-kube-api-access-6v7ks\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.067477 4795 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-config\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.067485 4795 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.067496 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.067507 4795 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.067518 4795 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b48f6504-950a-40c3-858d-12a81641d180-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.681576 
4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" event={"ID":"b48f6504-950a-40c3-858d-12a81641d180","Type":"ContainerDied","Data":"061e0a51206479568650751b31c9157c823011ca703ce1aa5a2b5ff63b3a51fc"} Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.681645 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-xxf7n" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.681955 4795 scope.go:117] "RemoveContainer" containerID="ef811a407deffe044267fa81b7e5926700f5bb353043d994d889927922e6d65e" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.708461 4795 scope.go:117] "RemoveContainer" containerID="1667e1c0d207723c3f1ed684a597536617a85b6c69f07bd159506ad977c46ce7" Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.712633 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-xxf7n"] Mar 10 15:30:24 crc kubenswrapper[4795]: I0310 15:30:24.720824 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-xxf7n"] Mar 10 15:30:25 crc kubenswrapper[4795]: I0310 15:30:25.500935 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b48f6504-950a-40c3-858d-12a81641d180" path="/var/lib/kubelet/pods/b48f6504-950a-40c3-858d-12a81641d180/volumes" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.117615 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gp2f9"] Mar 10 15:30:29 crc kubenswrapper[4795]: E0310 15:30:29.118704 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48f6504-950a-40c3-858d-12a81641d180" containerName="init" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.118721 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48f6504-950a-40c3-858d-12a81641d180" containerName="init" Mar 10 15:30:29 crc kubenswrapper[4795]: E0310 15:30:29.118733 4795 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48f6504-950a-40c3-858d-12a81641d180" containerName="dnsmasq-dns" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.118741 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48f6504-950a-40c3-858d-12a81641d180" containerName="dnsmasq-dns" Mar 10 15:30:29 crc kubenswrapper[4795]: E0310 15:30:29.118773 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="202d6bb3-868d-43da-81a0-1321d737fbc8" containerName="dnsmasq-dns" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.118782 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="202d6bb3-868d-43da-81a0-1321d737fbc8" containerName="dnsmasq-dns" Mar 10 15:30:29 crc kubenswrapper[4795]: E0310 15:30:29.118800 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="202d6bb3-868d-43da-81a0-1321d737fbc8" containerName="init" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.118808 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="202d6bb3-868d-43da-81a0-1321d737fbc8" containerName="init" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.119043 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="b48f6504-950a-40c3-858d-12a81641d180" containerName="dnsmasq-dns" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.119082 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="202d6bb3-868d-43da-81a0-1321d737fbc8" containerName="dnsmasq-dns" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.120823 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.137207 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gp2f9"] Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.275165 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nff4r\" (UniqueName: \"kubernetes.io/projected/953d19fd-ee22-4b03-9a62-de8118c5832f-kube-api-access-nff4r\") pod \"redhat-operators-gp2f9\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.275497 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-utilities\") pod \"redhat-operators-gp2f9\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.275549 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-catalog-content\") pod \"redhat-operators-gp2f9\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.377194 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-utilities\") pod \"redhat-operators-gp2f9\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.377240 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-catalog-content\") pod \"redhat-operators-gp2f9\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.377309 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nff4r\" (UniqueName: \"kubernetes.io/projected/953d19fd-ee22-4b03-9a62-de8118c5832f-kube-api-access-nff4r\") pod \"redhat-operators-gp2f9\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.377741 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-utilities\") pod \"redhat-operators-gp2f9\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.377748 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-catalog-content\") pod \"redhat-operators-gp2f9\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.395011 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nff4r\" (UniqueName: \"kubernetes.io/projected/953d19fd-ee22-4b03-9a62-de8118c5832f-kube-api-access-nff4r\") pod \"redhat-operators-gp2f9\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.443027 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:29 crc kubenswrapper[4795]: W0310 15:30:29.887540 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod953d19fd_ee22_4b03_9a62_de8118c5832f.slice/crio-a7c52cf0b1651f6b64cab2da09377cf2ed57dbbce7ecba3b7d7d15555cb775f3 WatchSource:0}: Error finding container a7c52cf0b1651f6b64cab2da09377cf2ed57dbbce7ecba3b7d7d15555cb775f3: Status 404 returned error can't find the container with id a7c52cf0b1651f6b64cab2da09377cf2ed57dbbce7ecba3b7d7d15555cb775f3 Mar 10 15:30:29 crc kubenswrapper[4795]: I0310 15:30:29.893540 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gp2f9"] Mar 10 15:30:30 crc kubenswrapper[4795]: I0310 15:30:30.747218 4795 generic.go:334] "Generic (PLEG): container finished" podID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerID="141f96a28331f4d664fb628467b19391955e1a66d648b9764bb28629b4a8042f" exitCode=0 Mar 10 15:30:30 crc kubenswrapper[4795]: I0310 15:30:30.747273 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp2f9" event={"ID":"953d19fd-ee22-4b03-9a62-de8118c5832f","Type":"ContainerDied","Data":"141f96a28331f4d664fb628467b19391955e1a66d648b9764bb28629b4a8042f"} Mar 10 15:30:30 crc kubenswrapper[4795]: I0310 15:30:30.747991 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp2f9" event={"ID":"953d19fd-ee22-4b03-9a62-de8118c5832f","Type":"ContainerStarted","Data":"a7c52cf0b1651f6b64cab2da09377cf2ed57dbbce7ecba3b7d7d15555cb775f3"} Mar 10 15:30:32 crc kubenswrapper[4795]: I0310 15:30:32.769770 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp2f9" 
event={"ID":"953d19fd-ee22-4b03-9a62-de8118c5832f","Type":"ContainerStarted","Data":"e0ee84f699d954825b94d6b2874e66a1cb53bd371002d8da3011b46a73bcbf3d"} Mar 10 15:30:34 crc kubenswrapper[4795]: I0310 15:30:34.797713 4795 generic.go:334] "Generic (PLEG): container finished" podID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerID="e0ee84f699d954825b94d6b2874e66a1cb53bd371002d8da3011b46a73bcbf3d" exitCode=0 Mar 10 15:30:34 crc kubenswrapper[4795]: I0310 15:30:34.797786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp2f9" event={"ID":"953d19fd-ee22-4b03-9a62-de8118c5832f","Type":"ContainerDied","Data":"e0ee84f699d954825b94d6b2874e66a1cb53bd371002d8da3011b46a73bcbf3d"} Mar 10 15:30:35 crc kubenswrapper[4795]: I0310 15:30:35.808487 4795 generic.go:334] "Generic (PLEG): container finished" podID="b9cd90de-c39a-41a8-92cb-1f2dc799209f" containerID="78828fa71d141129fd642b098d5a569a75a16cee03153195b75a8c419e07d9af" exitCode=0 Mar 10 15:30:35 crc kubenswrapper[4795]: I0310 15:30:35.808571 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9cd90de-c39a-41a8-92cb-1f2dc799209f","Type":"ContainerDied","Data":"78828fa71d141129fd642b098d5a569a75a16cee03153195b75a8c419e07d9af"} Mar 10 15:30:35 crc kubenswrapper[4795]: I0310 15:30:35.811021 4795 generic.go:334] "Generic (PLEG): container finished" podID="460e64c3-819a-4bfe-859f-ea0b5400be76" containerID="5c50e643fc7d2f517473132a3cdba7ee99c3684f736f8a5f92dcd984deaad5e5" exitCode=0 Mar 10 15:30:35 crc kubenswrapper[4795]: I0310 15:30:35.811155 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"460e64c3-819a-4bfe-859f-ea0b5400be76","Type":"ContainerDied","Data":"5c50e643fc7d2f517473132a3cdba7ee99c3684f736f8a5f92dcd984deaad5e5"} Mar 10 15:30:35 crc kubenswrapper[4795]: I0310 15:30:35.813202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-gp2f9" event={"ID":"953d19fd-ee22-4b03-9a62-de8118c5832f","Type":"ContainerStarted","Data":"2052ad61560cc95a0ff98325874d5f8ba4ff73d216c59af9829ded5f635296ee"} Mar 10 15:30:35 crc kubenswrapper[4795]: I0310 15:30:35.901948 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gp2f9" podStartSLOduration=2.125129499 podStartE2EDuration="6.901929648s" podCreationTimestamp="2026-03-10 15:30:29 +0000 UTC" firstStartedPulling="2026-03-10 15:30:30.751757009 +0000 UTC m=+1463.917497917" lastFinishedPulling="2026-03-10 15:30:35.528557168 +0000 UTC m=+1468.694298066" observedRunningTime="2026-03-10 15:30:35.89886094 +0000 UTC m=+1469.064601838" watchObservedRunningTime="2026-03-10 15:30:35.901929648 +0000 UTC m=+1469.067670546" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.769007 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx"] Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.770313 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.772511 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.788232 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.788264 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.801321 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.804347 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx"] Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.841020 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b9cd90de-c39a-41a8-92cb-1f2dc799209f","Type":"ContainerStarted","Data":"6b749850f9215cbc9724b59f5f9e9a74eb197b65656a2a1730418fec08fcb16f"} Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.842103 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.844741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"460e64c3-819a-4bfe-859f-ea0b5400be76","Type":"ContainerStarted","Data":"8decf20745a3858b704f7273e2e6410fae8701982f94f111602b641ecb84d22f"} Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.845256 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 
15:30:36.873260 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.873239251 podStartE2EDuration="37.873239251s" podCreationTimestamp="2026-03-10 15:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:36.867465946 +0000 UTC m=+1470.033206854" watchObservedRunningTime="2026-03-10 15:30:36.873239251 +0000 UTC m=+1470.038980149" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.916177 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.916134828 podStartE2EDuration="36.916134828s" podCreationTimestamp="2026-03-10 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 15:30:36.913165193 +0000 UTC m=+1470.078906091" watchObservedRunningTime="2026-03-10 15:30:36.916134828 +0000 UTC m=+1470.081875726" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.919777 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.919877 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.919945 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:36 crc kubenswrapper[4795]: I0310 15:30:36.919999 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdm52\" (UniqueName: \"kubernetes.io/projected/2d340909-34ac-4a94-89f5-c4759eb3374f-kube-api-access-sdm52\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:37 crc kubenswrapper[4795]: I0310 15:30:37.022027 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:37 crc kubenswrapper[4795]: I0310 15:30:37.022147 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:37 crc kubenswrapper[4795]: I0310 15:30:37.022192 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:37 crc kubenswrapper[4795]: I0310 15:30:37.022233 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdm52\" (UniqueName: \"kubernetes.io/projected/2d340909-34ac-4a94-89f5-c4759eb3374f-kube-api-access-sdm52\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:37 crc kubenswrapper[4795]: I0310 15:30:37.027246 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:37 crc kubenswrapper[4795]: I0310 15:30:37.028127 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:37 crc kubenswrapper[4795]: I0310 15:30:37.028200 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-ssh-key-openstack-edpm-ipam\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:37 crc kubenswrapper[4795]: I0310 15:30:37.044392 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdm52\" (UniqueName: \"kubernetes.io/projected/2d340909-34ac-4a94-89f5-c4759eb3374f-kube-api-access-sdm52\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:37 crc kubenswrapper[4795]: I0310 15:30:37.132276 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:30:37 crc kubenswrapper[4795]: I0310 15:30:37.749937 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx"] Mar 10 15:30:37 crc kubenswrapper[4795]: W0310 15:30:37.754465 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d340909_34ac_4a94_89f5_c4759eb3374f.slice/crio-b11a06e742abe155c4b2c31315360d09afbb198b39b770d02d67e8b2fdae86fe WatchSource:0}: Error finding container b11a06e742abe155c4b2c31315360d09afbb198b39b770d02d67e8b2fdae86fe: Status 404 returned error can't find the container with id b11a06e742abe155c4b2c31315360d09afbb198b39b770d02d67e8b2fdae86fe Mar 10 15:30:37 crc kubenswrapper[4795]: I0310 15:30:37.857641 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" event={"ID":"2d340909-34ac-4a94-89f5-c4759eb3374f","Type":"ContainerStarted","Data":"b11a06e742abe155c4b2c31315360d09afbb198b39b770d02d67e8b2fdae86fe"} Mar 10 15:30:39 crc kubenswrapper[4795]: I0310 15:30:39.444189 4795 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:39 crc kubenswrapper[4795]: I0310 15:30:39.444691 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:40 crc kubenswrapper[4795]: I0310 15:30:40.505406 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gp2f9" podUID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerName="registry-server" probeResult="failure" output=< Mar 10 15:30:40 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 10 15:30:40 crc kubenswrapper[4795]: > Mar 10 15:30:42 crc kubenswrapper[4795]: I0310 15:30:42.458211 4795 scope.go:117] "RemoveContainer" containerID="dbff0b7c150499c38617bf7d5602148c0546863529a9fd0f232fbe78a9a6b56f" Mar 10 15:30:49 crc kubenswrapper[4795]: I0310 15:30:49.494426 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:49 crc kubenswrapper[4795]: I0310 15:30:49.553592 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:49 crc kubenswrapper[4795]: I0310 15:30:49.706330 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 10 15:30:49 crc kubenswrapper[4795]: I0310 15:30:49.757942 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gp2f9"] Mar 10 15:30:50 crc kubenswrapper[4795]: I0310 15:30:50.980800 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gp2f9" podUID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerName="registry-server" containerID="cri-o://2052ad61560cc95a0ff98325874d5f8ba4ff73d216c59af9829ded5f635296ee" gracePeriod=2 Mar 10 15:30:51 crc kubenswrapper[4795]: I0310 
15:30:51.009787 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 10 15:30:51 crc kubenswrapper[4795]: I0310 15:30:51.994382 4795 generic.go:334] "Generic (PLEG): container finished" podID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerID="2052ad61560cc95a0ff98325874d5f8ba4ff73d216c59af9829ded5f635296ee" exitCode=0 Mar 10 15:30:51 crc kubenswrapper[4795]: I0310 15:30:51.994429 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp2f9" event={"ID":"953d19fd-ee22-4b03-9a62-de8118c5832f","Type":"ContainerDied","Data":"2052ad61560cc95a0ff98325874d5f8ba4ff73d216c59af9829ded5f635296ee"} Mar 10 15:30:52 crc kubenswrapper[4795]: I0310 15:30:52.430527 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:52 crc kubenswrapper[4795]: I0310 15:30:52.559772 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nff4r\" (UniqueName: \"kubernetes.io/projected/953d19fd-ee22-4b03-9a62-de8118c5832f-kube-api-access-nff4r\") pod \"953d19fd-ee22-4b03-9a62-de8118c5832f\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " Mar 10 15:30:52 crc kubenswrapper[4795]: I0310 15:30:52.560588 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-catalog-content\") pod \"953d19fd-ee22-4b03-9a62-de8118c5832f\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " Mar 10 15:30:52 crc kubenswrapper[4795]: I0310 15:30:52.560658 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-utilities\") pod \"953d19fd-ee22-4b03-9a62-de8118c5832f\" (UID: \"953d19fd-ee22-4b03-9a62-de8118c5832f\") " Mar 10 15:30:52 crc 
kubenswrapper[4795]: I0310 15:30:52.561595 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-utilities" (OuterVolumeSpecName: "utilities") pod "953d19fd-ee22-4b03-9a62-de8118c5832f" (UID: "953d19fd-ee22-4b03-9a62-de8118c5832f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:30:52 crc kubenswrapper[4795]: I0310 15:30:52.567841 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/953d19fd-ee22-4b03-9a62-de8118c5832f-kube-api-access-nff4r" (OuterVolumeSpecName: "kube-api-access-nff4r") pod "953d19fd-ee22-4b03-9a62-de8118c5832f" (UID: "953d19fd-ee22-4b03-9a62-de8118c5832f"). InnerVolumeSpecName "kube-api-access-nff4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:30:52 crc kubenswrapper[4795]: I0310 15:30:52.663010 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:52 crc kubenswrapper[4795]: I0310 15:30:52.663048 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nff4r\" (UniqueName: \"kubernetes.io/projected/953d19fd-ee22-4b03-9a62-de8118c5832f-kube-api-access-nff4r\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:52 crc kubenswrapper[4795]: I0310 15:30:52.701585 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "953d19fd-ee22-4b03-9a62-de8118c5832f" (UID: "953d19fd-ee22-4b03-9a62-de8118c5832f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:30:52 crc kubenswrapper[4795]: I0310 15:30:52.764665 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/953d19fd-ee22-4b03-9a62-de8118c5832f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:30:53 crc kubenswrapper[4795]: I0310 15:30:53.007872 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gp2f9" event={"ID":"953d19fd-ee22-4b03-9a62-de8118c5832f","Type":"ContainerDied","Data":"a7c52cf0b1651f6b64cab2da09377cf2ed57dbbce7ecba3b7d7d15555cb775f3"} Mar 10 15:30:53 crc kubenswrapper[4795]: I0310 15:30:53.007947 4795 scope.go:117] "RemoveContainer" containerID="2052ad61560cc95a0ff98325874d5f8ba4ff73d216c59af9829ded5f635296ee" Mar 10 15:30:53 crc kubenswrapper[4795]: I0310 15:30:53.008129 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gp2f9" Mar 10 15:30:53 crc kubenswrapper[4795]: I0310 15:30:53.018497 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" event={"ID":"2d340909-34ac-4a94-89f5-c4759eb3374f","Type":"ContainerStarted","Data":"c342338359336591f40fd7788e27347be237b7187338b409d27f8aa51806f682"} Mar 10 15:30:53 crc kubenswrapper[4795]: I0310 15:30:53.055725 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" podStartSLOduration=2.658949165 podStartE2EDuration="17.055704464s" podCreationTimestamp="2026-03-10 15:30:36 +0000 UTC" firstStartedPulling="2026-03-10 15:30:37.756675712 +0000 UTC m=+1470.922416610" lastFinishedPulling="2026-03-10 15:30:52.153430991 +0000 UTC m=+1485.319171909" observedRunningTime="2026-03-10 15:30:53.040625613 +0000 UTC m=+1486.206366521" watchObservedRunningTime="2026-03-10 15:30:53.055704464 +0000 UTC 
m=+1486.221445372" Mar 10 15:30:53 crc kubenswrapper[4795]: I0310 15:30:53.068750 4795 scope.go:117] "RemoveContainer" containerID="e0ee84f699d954825b94d6b2874e66a1cb53bd371002d8da3011b46a73bcbf3d" Mar 10 15:30:53 crc kubenswrapper[4795]: I0310 15:30:53.071115 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gp2f9"] Mar 10 15:30:53 crc kubenswrapper[4795]: I0310 15:30:53.078843 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gp2f9"] Mar 10 15:30:53 crc kubenswrapper[4795]: I0310 15:30:53.089847 4795 scope.go:117] "RemoveContainer" containerID="141f96a28331f4d664fb628467b19391955e1a66d648b9764bb28629b4a8042f" Mar 10 15:30:53 crc kubenswrapper[4795]: I0310 15:30:53.509311 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="953d19fd-ee22-4b03-9a62-de8118c5832f" path="/var/lib/kubelet/pods/953d19fd-ee22-4b03-9a62-de8118c5832f/volumes" Mar 10 15:31:03 crc kubenswrapper[4795]: I0310 15:31:03.120689 4795 generic.go:334] "Generic (PLEG): container finished" podID="2d340909-34ac-4a94-89f5-c4759eb3374f" containerID="c342338359336591f40fd7788e27347be237b7187338b409d27f8aa51806f682" exitCode=0 Mar 10 15:31:03 crc kubenswrapper[4795]: I0310 15:31:03.120752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" event={"ID":"2d340909-34ac-4a94-89f5-c4759eb3374f","Type":"ContainerDied","Data":"c342338359336591f40fd7788e27347be237b7187338b409d27f8aa51806f682"} Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.493091 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.584479 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-ssh-key-openstack-edpm-ipam\") pod \"2d340909-34ac-4a94-89f5-c4759eb3374f\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.584548 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-repo-setup-combined-ca-bundle\") pod \"2d340909-34ac-4a94-89f5-c4759eb3374f\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.584701 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdm52\" (UniqueName: \"kubernetes.io/projected/2d340909-34ac-4a94-89f5-c4759eb3374f-kube-api-access-sdm52\") pod \"2d340909-34ac-4a94-89f5-c4759eb3374f\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.584764 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-inventory\") pod \"2d340909-34ac-4a94-89f5-c4759eb3374f\" (UID: \"2d340909-34ac-4a94-89f5-c4759eb3374f\") " Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.589346 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d340909-34ac-4a94-89f5-c4759eb3374f-kube-api-access-sdm52" (OuterVolumeSpecName: "kube-api-access-sdm52") pod "2d340909-34ac-4a94-89f5-c4759eb3374f" (UID: "2d340909-34ac-4a94-89f5-c4759eb3374f"). InnerVolumeSpecName "kube-api-access-sdm52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.591208 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2d340909-34ac-4a94-89f5-c4759eb3374f" (UID: "2d340909-34ac-4a94-89f5-c4759eb3374f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.610307 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-inventory" (OuterVolumeSpecName: "inventory") pod "2d340909-34ac-4a94-89f5-c4759eb3374f" (UID: "2d340909-34ac-4a94-89f5-c4759eb3374f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.622463 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2d340909-34ac-4a94-89f5-c4759eb3374f" (UID: "2d340909-34ac-4a94-89f5-c4759eb3374f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.686772 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdm52\" (UniqueName: \"kubernetes.io/projected/2d340909-34ac-4a94-89f5-c4759eb3374f-kube-api-access-sdm52\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.686806 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.686815 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:04 crc kubenswrapper[4795]: I0310 15:31:04.686824 4795 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d340909-34ac-4a94-89f5-c4759eb3374f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.143443 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" event={"ID":"2d340909-34ac-4a94-89f5-c4759eb3374f","Type":"ContainerDied","Data":"b11a06e742abe155c4b2c31315360d09afbb198b39b770d02d67e8b2fdae86fe"} Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.143491 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b11a06e742abe155c4b2c31315360d09afbb198b39b770d02d67e8b2fdae86fe" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.144140 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.227350 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd"] Mar 10 15:31:05 crc kubenswrapper[4795]: E0310 15:31:05.227725 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerName="extract-utilities" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.227744 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerName="extract-utilities" Mar 10 15:31:05 crc kubenswrapper[4795]: E0310 15:31:05.227767 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerName="extract-content" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.227775 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerName="extract-content" Mar 10 15:31:05 crc kubenswrapper[4795]: E0310 15:31:05.227800 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d340909-34ac-4a94-89f5-c4759eb3374f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.227812 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d340909-34ac-4a94-89f5-c4759eb3374f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 15:31:05 crc kubenswrapper[4795]: E0310 15:31:05.227829 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerName="registry-server" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.227835 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerName="registry-server" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.228005 4795 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2d340909-34ac-4a94-89f5-c4759eb3374f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.228024 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="953d19fd-ee22-4b03-9a62-de8118c5832f" containerName="registry-server" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.228695 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.234169 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.234432 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.236057 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.236428 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.240104 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd"] Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.399136 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ms8kd\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 
15:31:05.399227 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86mzb\" (UniqueName: \"kubernetes.io/projected/89beb105-18d9-49fa-9eda-afe0819518d9-kube-api-access-86mzb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ms8kd\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.399361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ms8kd\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.500808 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ms8kd\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.500861 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86mzb\" (UniqueName: \"kubernetes.io/projected/89beb105-18d9-49fa-9eda-afe0819518d9-kube-api-access-86mzb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ms8kd\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.500991 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-inventory\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-ms8kd\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.504629 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ms8kd\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.504715 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ms8kd\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.527304 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86mzb\" (UniqueName: \"kubernetes.io/projected/89beb105-18d9-49fa-9eda-afe0819518d9-kube-api-access-86mzb\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ms8kd\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:05 crc kubenswrapper[4795]: I0310 15:31:05.566390 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.205275 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd"] Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.542035 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vbzwq"] Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.544737 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.553117 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbzwq"] Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.649700 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj7vj\" (UniqueName: \"kubernetes.io/projected/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-kube-api-access-wj7vj\") pod \"community-operators-vbzwq\" (UID: \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") " pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.649771 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-catalog-content\") pod \"community-operators-vbzwq\" (UID: \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") " pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.649813 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-utilities\") pod \"community-operators-vbzwq\" (UID: 
\"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") " pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.751934 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj7vj\" (UniqueName: \"kubernetes.io/projected/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-kube-api-access-wj7vj\") pod \"community-operators-vbzwq\" (UID: \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") " pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.752009 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-catalog-content\") pod \"community-operators-vbzwq\" (UID: \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") " pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.752595 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-catalog-content\") pod \"community-operators-vbzwq\" (UID: \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") " pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.752645 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-utilities\") pod \"community-operators-vbzwq\" (UID: \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") " pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.752731 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-utilities\") pod \"community-operators-vbzwq\" (UID: \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") 
" pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.769228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj7vj\" (UniqueName: \"kubernetes.io/projected/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-kube-api-access-wj7vj\") pod \"community-operators-vbzwq\" (UID: \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") " pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:06 crc kubenswrapper[4795]: I0310 15:31:06.877926 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:07 crc kubenswrapper[4795]: I0310 15:31:07.164146 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" event={"ID":"89beb105-18d9-49fa-9eda-afe0819518d9","Type":"ContainerStarted","Data":"0008795dfcd3197299ab99e8ce47601db5f963b321f3876390866c43b5bab119"} Mar 10 15:31:07 crc kubenswrapper[4795]: I0310 15:31:07.164501 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" event={"ID":"89beb105-18d9-49fa-9eda-afe0819518d9","Type":"ContainerStarted","Data":"a2aecd61da8f4b85961e2fd4742026ec6c625704ae0e146ba5b0dfa8d4c69b7f"} Mar 10 15:31:07 crc kubenswrapper[4795]: I0310 15:31:07.186549 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" podStartSLOduration=1.732225436 podStartE2EDuration="2.186531602s" podCreationTimestamp="2026-03-10 15:31:05 +0000 UTC" firstStartedPulling="2026-03-10 15:31:06.218355837 +0000 UTC m=+1499.384096735" lastFinishedPulling="2026-03-10 15:31:06.672662003 +0000 UTC m=+1499.838402901" observedRunningTime="2026-03-10 15:31:07.176117974 +0000 UTC m=+1500.341858872" watchObservedRunningTime="2026-03-10 15:31:07.186531602 +0000 UTC m=+1500.352272500" Mar 10 15:31:07 crc 
kubenswrapper[4795]: I0310 15:31:07.392184 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vbzwq"] Mar 10 15:31:07 crc kubenswrapper[4795]: W0310 15:31:07.396155 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8430bff2_c7ee_4f82_989a_8d6cd2e2aaae.slice/crio-f5d1dba979074f23701d0e0d5e24592a377935554574d2d668d0df686dc6b646 WatchSource:0}: Error finding container f5d1dba979074f23701d0e0d5e24592a377935554574d2d668d0df686dc6b646: Status 404 returned error can't find the container with id f5d1dba979074f23701d0e0d5e24592a377935554574d2d668d0df686dc6b646 Mar 10 15:31:08 crc kubenswrapper[4795]: I0310 15:31:08.184413 4795 generic.go:334] "Generic (PLEG): container finished" podID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" containerID="3dd8155af6c1b7934166d96b8d9a8b36c1181615a8a851075fd93c47882f5b03" exitCode=0 Mar 10 15:31:08 crc kubenswrapper[4795]: I0310 15:31:08.185459 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbzwq" event={"ID":"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae","Type":"ContainerDied","Data":"3dd8155af6c1b7934166d96b8d9a8b36c1181615a8a851075fd93c47882f5b03"} Mar 10 15:31:08 crc kubenswrapper[4795]: I0310 15:31:08.185549 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbzwq" event={"ID":"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae","Type":"ContainerStarted","Data":"f5d1dba979074f23701d0e0d5e24592a377935554574d2d668d0df686dc6b646"} Mar 10 15:31:09 crc kubenswrapper[4795]: I0310 15:31:09.196051 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbzwq" event={"ID":"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae","Type":"ContainerStarted","Data":"94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d"} Mar 10 15:31:10 crc kubenswrapper[4795]: I0310 15:31:10.211426 4795 
generic.go:334] "Generic (PLEG): container finished" podID="89beb105-18d9-49fa-9eda-afe0819518d9" containerID="0008795dfcd3197299ab99e8ce47601db5f963b321f3876390866c43b5bab119" exitCode=0 Mar 10 15:31:10 crc kubenswrapper[4795]: I0310 15:31:10.211552 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" event={"ID":"89beb105-18d9-49fa-9eda-afe0819518d9","Type":"ContainerDied","Data":"0008795dfcd3197299ab99e8ce47601db5f963b321f3876390866c43b5bab119"} Mar 10 15:31:10 crc kubenswrapper[4795]: I0310 15:31:10.215437 4795 generic.go:334] "Generic (PLEG): container finished" podID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" containerID="94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d" exitCode=0 Mar 10 15:31:10 crc kubenswrapper[4795]: I0310 15:31:10.215502 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbzwq" event={"ID":"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae","Type":"ContainerDied","Data":"94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d"} Mar 10 15:31:11 crc kubenswrapper[4795]: I0310 15:31:11.699516 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:11 crc kubenswrapper[4795]: I0310 15:31:11.864822 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-inventory\") pod \"89beb105-18d9-49fa-9eda-afe0819518d9\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " Mar 10 15:31:11 crc kubenswrapper[4795]: I0310 15:31:11.864913 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86mzb\" (UniqueName: \"kubernetes.io/projected/89beb105-18d9-49fa-9eda-afe0819518d9-kube-api-access-86mzb\") pod \"89beb105-18d9-49fa-9eda-afe0819518d9\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " Mar 10 15:31:11 crc kubenswrapper[4795]: I0310 15:31:11.865177 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-ssh-key-openstack-edpm-ipam\") pod \"89beb105-18d9-49fa-9eda-afe0819518d9\" (UID: \"89beb105-18d9-49fa-9eda-afe0819518d9\") " Mar 10 15:31:11 crc kubenswrapper[4795]: I0310 15:31:11.874800 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89beb105-18d9-49fa-9eda-afe0819518d9-kube-api-access-86mzb" (OuterVolumeSpecName: "kube-api-access-86mzb") pod "89beb105-18d9-49fa-9eda-afe0819518d9" (UID: "89beb105-18d9-49fa-9eda-afe0819518d9"). InnerVolumeSpecName "kube-api-access-86mzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:11 crc kubenswrapper[4795]: I0310 15:31:11.904317 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-inventory" (OuterVolumeSpecName: "inventory") pod "89beb105-18d9-49fa-9eda-afe0819518d9" (UID: "89beb105-18d9-49fa-9eda-afe0819518d9"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:31:11 crc kubenswrapper[4795]: I0310 15:31:11.904409 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "89beb105-18d9-49fa-9eda-afe0819518d9" (UID: "89beb105-18d9-49fa-9eda-afe0819518d9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:31:11 crc kubenswrapper[4795]: I0310 15:31:11.967982 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:11 crc kubenswrapper[4795]: I0310 15:31:11.968016 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/89beb105-18d9-49fa-9eda-afe0819518d9-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:11 crc kubenswrapper[4795]: I0310 15:31:11.968025 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86mzb\" (UniqueName: \"kubernetes.io/projected/89beb105-18d9-49fa-9eda-afe0819518d9-kube-api-access-86mzb\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.240648 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" event={"ID":"89beb105-18d9-49fa-9eda-afe0819518d9","Type":"ContainerDied","Data":"a2aecd61da8f4b85961e2fd4742026ec6c625704ae0e146ba5b0dfa8d4c69b7f"} Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.240963 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2aecd61da8f4b85961e2fd4742026ec6c625704ae0e146ba5b0dfa8d4c69b7f" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 
15:31:12.240716 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ms8kd" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.243970 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbzwq" event={"ID":"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae","Type":"ContainerStarted","Data":"a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31"} Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.269831 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vbzwq" podStartSLOduration=2.595735352 podStartE2EDuration="6.269809537s" podCreationTimestamp="2026-03-10 15:31:06 +0000 UTC" firstStartedPulling="2026-03-10 15:31:08.191314813 +0000 UTC m=+1501.357055731" lastFinishedPulling="2026-03-10 15:31:11.865389008 +0000 UTC m=+1505.031129916" observedRunningTime="2026-03-10 15:31:12.264946008 +0000 UTC m=+1505.430686906" watchObservedRunningTime="2026-03-10 15:31:12.269809537 +0000 UTC m=+1505.435550435" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.319934 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6"] Mar 10 15:31:12 crc kubenswrapper[4795]: E0310 15:31:12.320358 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89beb105-18d9-49fa-9eda-afe0819518d9" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.320379 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="89beb105-18d9-49fa-9eda-afe0819518d9" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.320579 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="89beb105-18d9-49fa-9eda-afe0819518d9" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 10 
15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.321241 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.324570 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.325817 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.326136 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.326212 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.330519 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6"] Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.480667 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmp5r\" (UniqueName: \"kubernetes.io/projected/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-kube-api-access-nmp5r\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.480761 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.480797 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.480950 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.583245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.583312 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.583391 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.583528 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmp5r\" (UniqueName: \"kubernetes.io/projected/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-kube-api-access-nmp5r\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.588550 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.589033 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.595547 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: 
\"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.604332 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmp5r\" (UniqueName: \"kubernetes.io/projected/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-kube-api-access-nmp5r\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:12 crc kubenswrapper[4795]: I0310 15:31:12.646532 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:31:13 crc kubenswrapper[4795]: I0310 15:31:13.191317 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6"] Mar 10 15:31:13 crc kubenswrapper[4795]: I0310 15:31:13.191971 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:31:13 crc kubenswrapper[4795]: I0310 15:31:13.254458 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" event={"ID":"74f728b8-775b-45f0-90f9-8e4d8e77e5bc","Type":"ContainerStarted","Data":"44c5b49ab22218394511c28e98e901f2c1e51c6fc878e88efcc76ea3bddef488"} Mar 10 15:31:14 crc kubenswrapper[4795]: I0310 15:31:14.263156 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" event={"ID":"74f728b8-775b-45f0-90f9-8e4d8e77e5bc","Type":"ContainerStarted","Data":"9b7d21d58325de0d016ab392915adf81f3943428d11ef7e990b9f97acff2deb5"} Mar 10 15:31:14 crc kubenswrapper[4795]: I0310 15:31:14.289508 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" 
podStartSLOduration=1.710703433 podStartE2EDuration="2.289490279s" podCreationTimestamp="2026-03-10 15:31:12 +0000 UTC" firstStartedPulling="2026-03-10 15:31:13.191725688 +0000 UTC m=+1506.357466586" lastFinishedPulling="2026-03-10 15:31:13.770512524 +0000 UTC m=+1506.936253432" observedRunningTime="2026-03-10 15:31:14.28217848 +0000 UTC m=+1507.447919378" watchObservedRunningTime="2026-03-10 15:31:14.289490279 +0000 UTC m=+1507.455231177" Mar 10 15:31:15 crc kubenswrapper[4795]: I0310 15:31:15.785562 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4xl4w"] Mar 10 15:31:15 crc kubenswrapper[4795]: I0310 15:31:15.789341 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:15 crc kubenswrapper[4795]: I0310 15:31:15.806227 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xl4w"] Mar 10 15:31:15 crc kubenswrapper[4795]: I0310 15:31:15.946370 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-catalog-content\") pod \"redhat-marketplace-4xl4w\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:15 crc kubenswrapper[4795]: I0310 15:31:15.946470 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-utilities\") pod \"redhat-marketplace-4xl4w\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:15 crc kubenswrapper[4795]: I0310 15:31:15.946535 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdzj4\" (UniqueName: 
\"kubernetes.io/projected/36709944-7a64-4443-bd13-7cadacb0918c-kube-api-access-rdzj4\") pod \"redhat-marketplace-4xl4w\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:16 crc kubenswrapper[4795]: I0310 15:31:16.048660 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-catalog-content\") pod \"redhat-marketplace-4xl4w\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:16 crc kubenswrapper[4795]: I0310 15:31:16.048744 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-utilities\") pod \"redhat-marketplace-4xl4w\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:16 crc kubenswrapper[4795]: I0310 15:31:16.048796 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdzj4\" (UniqueName: \"kubernetes.io/projected/36709944-7a64-4443-bd13-7cadacb0918c-kube-api-access-rdzj4\") pod \"redhat-marketplace-4xl4w\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:16 crc kubenswrapper[4795]: I0310 15:31:16.049236 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-catalog-content\") pod \"redhat-marketplace-4xl4w\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:16 crc kubenswrapper[4795]: I0310 15:31:16.049421 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-utilities\") pod \"redhat-marketplace-4xl4w\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:16 crc kubenswrapper[4795]: I0310 15:31:16.068899 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdzj4\" (UniqueName: \"kubernetes.io/projected/36709944-7a64-4443-bd13-7cadacb0918c-kube-api-access-rdzj4\") pod \"redhat-marketplace-4xl4w\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:16 crc kubenswrapper[4795]: I0310 15:31:16.165964 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:16 crc kubenswrapper[4795]: I0310 15:31:16.666445 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xl4w"] Mar 10 15:31:16 crc kubenswrapper[4795]: I0310 15:31:16.878417 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:16 crc kubenswrapper[4795]: I0310 15:31:16.878774 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:16 crc kubenswrapper[4795]: I0310 15:31:16.949735 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:17 crc kubenswrapper[4795]: I0310 15:31:17.308859 4795 generic.go:334] "Generic (PLEG): container finished" podID="36709944-7a64-4443-bd13-7cadacb0918c" containerID="e007742f19768fb78ea798f7062316b7c60c427a4a0e9ff3ce3882be3f9bb72a" exitCode=0 Mar 10 15:31:17 crc kubenswrapper[4795]: I0310 15:31:17.308928 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xl4w" 
event={"ID":"36709944-7a64-4443-bd13-7cadacb0918c","Type":"ContainerDied","Data":"e007742f19768fb78ea798f7062316b7c60c427a4a0e9ff3ce3882be3f9bb72a"} Mar 10 15:31:17 crc kubenswrapper[4795]: I0310 15:31:17.308999 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xl4w" event={"ID":"36709944-7a64-4443-bd13-7cadacb0918c","Type":"ContainerStarted","Data":"4a4a7e668acacd9efe4f9964b7a42d7ccd3682a080de1178fbc2c9b7fbe272b1"} Mar 10 15:31:17 crc kubenswrapper[4795]: I0310 15:31:17.401506 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:18 crc kubenswrapper[4795]: I0310 15:31:18.322619 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xl4w" event={"ID":"36709944-7a64-4443-bd13-7cadacb0918c","Type":"ContainerStarted","Data":"d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5"} Mar 10 15:31:19 crc kubenswrapper[4795]: I0310 15:31:19.346385 4795 generic.go:334] "Generic (PLEG): container finished" podID="36709944-7a64-4443-bd13-7cadacb0918c" containerID="d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5" exitCode=0 Mar 10 15:31:19 crc kubenswrapper[4795]: I0310 15:31:19.346453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xl4w" event={"ID":"36709944-7a64-4443-bd13-7cadacb0918c","Type":"ContainerDied","Data":"d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5"} Mar 10 15:31:19 crc kubenswrapper[4795]: I0310 15:31:19.360282 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbzwq"] Mar 10 15:31:19 crc kubenswrapper[4795]: I0310 15:31:19.360838 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vbzwq" podUID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" 
containerName="registry-server" containerID="cri-o://a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31" gracePeriod=2 Mar 10 15:31:19 crc kubenswrapper[4795]: I0310 15:31:19.944776 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.028690 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj7vj\" (UniqueName: \"kubernetes.io/projected/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-kube-api-access-wj7vj\") pod \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\" (UID: \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") " Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.028957 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-utilities\") pod \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\" (UID: \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") " Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.029385 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-catalog-content\") pod \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\" (UID: \"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae\") " Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.029956 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-utilities" (OuterVolumeSpecName: "utilities") pod "8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" (UID: "8430bff2-c7ee-4f82-989a-8d6cd2e2aaae"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.030509 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.036207 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-kube-api-access-wj7vj" (OuterVolumeSpecName: "kube-api-access-wj7vj") pod "8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" (UID: "8430bff2-c7ee-4f82-989a-8d6cd2e2aaae"). InnerVolumeSpecName "kube-api-access-wj7vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.104225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" (UID: "8430bff2-c7ee-4f82-989a-8d6cd2e2aaae"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.131989 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj7vj\" (UniqueName: \"kubernetes.io/projected/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-kube-api-access-wj7vj\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.132019 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.362434 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xl4w" event={"ID":"36709944-7a64-4443-bd13-7cadacb0918c","Type":"ContainerStarted","Data":"a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69"} Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.368037 4795 generic.go:334] "Generic (PLEG): container finished" podID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" containerID="a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31" exitCode=0 Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.368099 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbzwq" event={"ID":"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae","Type":"ContainerDied","Data":"a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31"} Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.368146 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vbzwq" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.368527 4795 scope.go:117] "RemoveContainer" containerID="a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.368430 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vbzwq" event={"ID":"8430bff2-c7ee-4f82-989a-8d6cd2e2aaae","Type":"ContainerDied","Data":"f5d1dba979074f23701d0e0d5e24592a377935554574d2d668d0df686dc6b646"} Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.405014 4795 scope.go:117] "RemoveContainer" containerID="94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.413428 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4xl4w" podStartSLOduration=2.927207875 podStartE2EDuration="5.413401411s" podCreationTimestamp="2026-03-10 15:31:15 +0000 UTC" firstStartedPulling="2026-03-10 15:31:17.311825652 +0000 UTC m=+1510.477566600" lastFinishedPulling="2026-03-10 15:31:19.798019228 +0000 UTC m=+1512.963760136" observedRunningTime="2026-03-10 15:31:20.387405268 +0000 UTC m=+1513.553146166" watchObservedRunningTime="2026-03-10 15:31:20.413401411 +0000 UTC m=+1513.579142319" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.439894 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vbzwq"] Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.445658 4795 scope.go:117] "RemoveContainer" containerID="3dd8155af6c1b7934166d96b8d9a8b36c1181615a8a851075fd93c47882f5b03" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.447652 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vbzwq"] Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.488739 4795 scope.go:117] 
"RemoveContainer" containerID="a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31" Mar 10 15:31:20 crc kubenswrapper[4795]: E0310 15:31:20.489499 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31\": container with ID starting with a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31 not found: ID does not exist" containerID="a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.489616 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31"} err="failed to get container status \"a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31\": rpc error: code = NotFound desc = could not find container \"a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31\": container with ID starting with a54df2a6fa3118478538500551e330c2dc0d32499dc4fb204c003a383a023c31 not found: ID does not exist" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.489716 4795 scope.go:117] "RemoveContainer" containerID="94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d" Mar 10 15:31:20 crc kubenswrapper[4795]: E0310 15:31:20.490118 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d\": container with ID starting with 94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d not found: ID does not exist" containerID="94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.490142 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d"} err="failed to get container status \"94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d\": rpc error: code = NotFound desc = could not find container \"94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d\": container with ID starting with 94accb3e2cdd7166e993bfb15028a353982285ddc8c1fdea1c99125b2ce1bf5d not found: ID does not exist" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.490160 4795 scope.go:117] "RemoveContainer" containerID="3dd8155af6c1b7934166d96b8d9a8b36c1181615a8a851075fd93c47882f5b03" Mar 10 15:31:20 crc kubenswrapper[4795]: E0310 15:31:20.490450 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd8155af6c1b7934166d96b8d9a8b36c1181615a8a851075fd93c47882f5b03\": container with ID starting with 3dd8155af6c1b7934166d96b8d9a8b36c1181615a8a851075fd93c47882f5b03 not found: ID does not exist" containerID="3dd8155af6c1b7934166d96b8d9a8b36c1181615a8a851075fd93c47882f5b03" Mar 10 15:31:20 crc kubenswrapper[4795]: I0310 15:31:20.490557 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd8155af6c1b7934166d96b8d9a8b36c1181615a8a851075fd93c47882f5b03"} err="failed to get container status \"3dd8155af6c1b7934166d96b8d9a8b36c1181615a8a851075fd93c47882f5b03\": rpc error: code = NotFound desc = could not find container \"3dd8155af6c1b7934166d96b8d9a8b36c1181615a8a851075fd93c47882f5b03\": container with ID starting with 3dd8155af6c1b7934166d96b8d9a8b36c1181615a8a851075fd93c47882f5b03 not found: ID does not exist" Mar 10 15:31:21 crc kubenswrapper[4795]: I0310 15:31:21.497139 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" path="/var/lib/kubelet/pods/8430bff2-c7ee-4f82-989a-8d6cd2e2aaae/volumes" Mar 10 15:31:26 crc kubenswrapper[4795]: I0310 
15:31:26.166940 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:26 crc kubenswrapper[4795]: I0310 15:31:26.167509 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:26 crc kubenswrapper[4795]: I0310 15:31:26.211907 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:26 crc kubenswrapper[4795]: I0310 15:31:26.526413 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:26 crc kubenswrapper[4795]: I0310 15:31:26.606166 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xl4w"] Mar 10 15:31:28 crc kubenswrapper[4795]: I0310 15:31:28.506246 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4xl4w" podUID="36709944-7a64-4443-bd13-7cadacb0918c" containerName="registry-server" containerID="cri-o://a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69" gracePeriod=2 Mar 10 15:31:28 crc kubenswrapper[4795]: I0310 15:31:28.933848 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.015151 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-catalog-content\") pod \"36709944-7a64-4443-bd13-7cadacb0918c\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.015405 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-utilities\") pod \"36709944-7a64-4443-bd13-7cadacb0918c\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.015477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdzj4\" (UniqueName: \"kubernetes.io/projected/36709944-7a64-4443-bd13-7cadacb0918c-kube-api-access-rdzj4\") pod \"36709944-7a64-4443-bd13-7cadacb0918c\" (UID: \"36709944-7a64-4443-bd13-7cadacb0918c\") " Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.017229 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-utilities" (OuterVolumeSpecName: "utilities") pod "36709944-7a64-4443-bd13-7cadacb0918c" (UID: "36709944-7a64-4443-bd13-7cadacb0918c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.021452 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36709944-7a64-4443-bd13-7cadacb0918c-kube-api-access-rdzj4" (OuterVolumeSpecName: "kube-api-access-rdzj4") pod "36709944-7a64-4443-bd13-7cadacb0918c" (UID: "36709944-7a64-4443-bd13-7cadacb0918c"). InnerVolumeSpecName "kube-api-access-rdzj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.048289 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36709944-7a64-4443-bd13-7cadacb0918c" (UID: "36709944-7a64-4443-bd13-7cadacb0918c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.117729 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdzj4\" (UniqueName: \"kubernetes.io/projected/36709944-7a64-4443-bd13-7cadacb0918c-kube-api-access-rdzj4\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.117772 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.117786 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36709944-7a64-4443-bd13-7cadacb0918c-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.516740 4795 generic.go:334] "Generic (PLEG): container finished" podID="36709944-7a64-4443-bd13-7cadacb0918c" containerID="a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69" exitCode=0 Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.516837 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xl4w" event={"ID":"36709944-7a64-4443-bd13-7cadacb0918c","Type":"ContainerDied","Data":"a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69"} Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.517154 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-4xl4w" event={"ID":"36709944-7a64-4443-bd13-7cadacb0918c","Type":"ContainerDied","Data":"4a4a7e668acacd9efe4f9964b7a42d7ccd3682a080de1178fbc2c9b7fbe272b1"} Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.517186 4795 scope.go:117] "RemoveContainer" containerID="a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.516884 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xl4w" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.559021 4795 scope.go:117] "RemoveContainer" containerID="d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.570274 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xl4w"] Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.589682 4795 scope.go:117] "RemoveContainer" containerID="e007742f19768fb78ea798f7062316b7c60c427a4a0e9ff3ce3882be3f9bb72a" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.594102 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xl4w"] Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.637346 4795 scope.go:117] "RemoveContainer" containerID="a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69" Mar 10 15:31:29 crc kubenswrapper[4795]: E0310 15:31:29.638192 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69\": container with ID starting with a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69 not found: ID does not exist" containerID="a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.638235 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69"} err="failed to get container status \"a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69\": rpc error: code = NotFound desc = could not find container \"a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69\": container with ID starting with a13c383e720f59137375f9185cbac12edb6955d57c3dfa4cd2ce18ff68870f69 not found: ID does not exist" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.638261 4795 scope.go:117] "RemoveContainer" containerID="d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5" Mar 10 15:31:29 crc kubenswrapper[4795]: E0310 15:31:29.638563 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5\": container with ID starting with d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5 not found: ID does not exist" containerID="d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.638593 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5"} err="failed to get container status \"d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5\": rpc error: code = NotFound desc = could not find container \"d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5\": container with ID starting with d9c6e13fe4a39d6d02228d64e018c8c5b4c65dbe87dcc9703b74663b8d0da3f5 not found: ID does not exist" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.638611 4795 scope.go:117] "RemoveContainer" containerID="e007742f19768fb78ea798f7062316b7c60c427a4a0e9ff3ce3882be3f9bb72a" Mar 10 15:31:29 crc kubenswrapper[4795]: E0310 
15:31:29.638876 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e007742f19768fb78ea798f7062316b7c60c427a4a0e9ff3ce3882be3f9bb72a\": container with ID starting with e007742f19768fb78ea798f7062316b7c60c427a4a0e9ff3ce3882be3f9bb72a not found: ID does not exist" containerID="e007742f19768fb78ea798f7062316b7c60c427a4a0e9ff3ce3882be3f9bb72a" Mar 10 15:31:29 crc kubenswrapper[4795]: I0310 15:31:29.639573 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e007742f19768fb78ea798f7062316b7c60c427a4a0e9ff3ce3882be3f9bb72a"} err="failed to get container status \"e007742f19768fb78ea798f7062316b7c60c427a4a0e9ff3ce3882be3f9bb72a\": rpc error: code = NotFound desc = could not find container \"e007742f19768fb78ea798f7062316b7c60c427a4a0e9ff3ce3882be3f9bb72a\": container with ID starting with e007742f19768fb78ea798f7062316b7c60c427a4a0e9ff3ce3882be3f9bb72a not found: ID does not exist" Mar 10 15:31:31 crc kubenswrapper[4795]: I0310 15:31:31.486369 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36709944-7a64-4443-bd13-7cadacb0918c" path="/var/lib/kubelet/pods/36709944-7a64-4443-bd13-7cadacb0918c/volumes" Mar 10 15:31:42 crc kubenswrapper[4795]: I0310 15:31:42.657365 4795 scope.go:117] "RemoveContainer" containerID="feedcd326acfb71c4075d5a088bbd09fcec66b1c401a6407d73cf777ac68913f" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.150466 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552612-xxx2v"] Mar 10 15:32:00 crc kubenswrapper[4795]: E0310 15:32:00.151330 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36709944-7a64-4443-bd13-7cadacb0918c" containerName="registry-server" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.151343 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="36709944-7a64-4443-bd13-7cadacb0918c" 
containerName="registry-server" Mar 10 15:32:00 crc kubenswrapper[4795]: E0310 15:32:00.151356 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" containerName="extract-utilities" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.151362 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" containerName="extract-utilities" Mar 10 15:32:00 crc kubenswrapper[4795]: E0310 15:32:00.151378 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36709944-7a64-4443-bd13-7cadacb0918c" containerName="extract-content" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.151385 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="36709944-7a64-4443-bd13-7cadacb0918c" containerName="extract-content" Mar 10 15:32:00 crc kubenswrapper[4795]: E0310 15:32:00.151402 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" containerName="registry-server" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.151409 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" containerName="registry-server" Mar 10 15:32:00 crc kubenswrapper[4795]: E0310 15:32:00.151418 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36709944-7a64-4443-bd13-7cadacb0918c" containerName="extract-utilities" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.151424 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="36709944-7a64-4443-bd13-7cadacb0918c" containerName="extract-utilities" Mar 10 15:32:00 crc kubenswrapper[4795]: E0310 15:32:00.151437 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" containerName="extract-content" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.151443 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" 
containerName="extract-content" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.151629 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8430bff2-c7ee-4f82-989a-8d6cd2e2aaae" containerName="registry-server" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.151655 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="36709944-7a64-4443-bd13-7cadacb0918c" containerName="registry-server" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.152213 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552612-xxx2v" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.155714 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.156141 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.156313 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.352229 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552612-xxx2v"] Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.447839 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkrfm\" (UniqueName: \"kubernetes.io/projected/a771fc14-3d9d-440c-a4e6-7e02569919c1-kube-api-access-qkrfm\") pod \"auto-csr-approver-29552612-xxx2v\" (UID: \"a771fc14-3d9d-440c-a4e6-7e02569919c1\") " pod="openshift-infra/auto-csr-approver-29552612-xxx2v" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.549355 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkrfm\" (UniqueName: 
\"kubernetes.io/projected/a771fc14-3d9d-440c-a4e6-7e02569919c1-kube-api-access-qkrfm\") pod \"auto-csr-approver-29552612-xxx2v\" (UID: \"a771fc14-3d9d-440c-a4e6-7e02569919c1\") " pod="openshift-infra/auto-csr-approver-29552612-xxx2v" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.572877 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkrfm\" (UniqueName: \"kubernetes.io/projected/a771fc14-3d9d-440c-a4e6-7e02569919c1-kube-api-access-qkrfm\") pod \"auto-csr-approver-29552612-xxx2v\" (UID: \"a771fc14-3d9d-440c-a4e6-7e02569919c1\") " pod="openshift-infra/auto-csr-approver-29552612-xxx2v" Mar 10 15:32:00 crc kubenswrapper[4795]: I0310 15:32:00.649139 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552612-xxx2v" Mar 10 15:32:01 crc kubenswrapper[4795]: I0310 15:32:01.076797 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552612-xxx2v"] Mar 10 15:32:01 crc kubenswrapper[4795]: I0310 15:32:01.922393 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552612-xxx2v" event={"ID":"a771fc14-3d9d-440c-a4e6-7e02569919c1","Type":"ContainerStarted","Data":"c944d45ece13fda8113ee14f93bb6cc2ccf624a9818c7ab7e6b6eb1ffd2eeb40"} Mar 10 15:32:02 crc kubenswrapper[4795]: I0310 15:32:02.944839 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552612-xxx2v" event={"ID":"a771fc14-3d9d-440c-a4e6-7e02569919c1","Type":"ContainerStarted","Data":"f6d987a694f05b8327d82ec0378cddffa2915c07f778d57c9fd7c571688f2765"} Mar 10 15:32:02 crc kubenswrapper[4795]: I0310 15:32:02.975654 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552612-xxx2v" podStartSLOduration=1.6572140850000001 podStartE2EDuration="2.975628787s" podCreationTimestamp="2026-03-10 15:32:00 +0000 UTC" 
firstStartedPulling="2026-03-10 15:32:01.087423076 +0000 UTC m=+1554.253163974" lastFinishedPulling="2026-03-10 15:32:02.405837738 +0000 UTC m=+1555.571578676" observedRunningTime="2026-03-10 15:32:02.960727391 +0000 UTC m=+1556.126468299" watchObservedRunningTime="2026-03-10 15:32:02.975628787 +0000 UTC m=+1556.141369695" Mar 10 15:32:03 crc kubenswrapper[4795]: I0310 15:32:03.958736 4795 generic.go:334] "Generic (PLEG): container finished" podID="a771fc14-3d9d-440c-a4e6-7e02569919c1" containerID="f6d987a694f05b8327d82ec0378cddffa2915c07f778d57c9fd7c571688f2765" exitCode=0 Mar 10 15:32:03 crc kubenswrapper[4795]: I0310 15:32:03.958831 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552612-xxx2v" event={"ID":"a771fc14-3d9d-440c-a4e6-7e02569919c1","Type":"ContainerDied","Data":"f6d987a694f05b8327d82ec0378cddffa2915c07f778d57c9fd7c571688f2765"} Mar 10 15:32:05 crc kubenswrapper[4795]: I0310 15:32:05.282378 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552612-xxx2v" Mar 10 15:32:05 crc kubenswrapper[4795]: I0310 15:32:05.454813 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkrfm\" (UniqueName: \"kubernetes.io/projected/a771fc14-3d9d-440c-a4e6-7e02569919c1-kube-api-access-qkrfm\") pod \"a771fc14-3d9d-440c-a4e6-7e02569919c1\" (UID: \"a771fc14-3d9d-440c-a4e6-7e02569919c1\") " Mar 10 15:32:05 crc kubenswrapper[4795]: I0310 15:32:05.461453 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a771fc14-3d9d-440c-a4e6-7e02569919c1-kube-api-access-qkrfm" (OuterVolumeSpecName: "kube-api-access-qkrfm") pod "a771fc14-3d9d-440c-a4e6-7e02569919c1" (UID: "a771fc14-3d9d-440c-a4e6-7e02569919c1"). InnerVolumeSpecName "kube-api-access-qkrfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:32:05 crc kubenswrapper[4795]: I0310 15:32:05.557495 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkrfm\" (UniqueName: \"kubernetes.io/projected/a771fc14-3d9d-440c-a4e6-7e02569919c1-kube-api-access-qkrfm\") on node \"crc\" DevicePath \"\"" Mar 10 15:32:05 crc kubenswrapper[4795]: I0310 15:32:05.976917 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552612-xxx2v" event={"ID":"a771fc14-3d9d-440c-a4e6-7e02569919c1","Type":"ContainerDied","Data":"c944d45ece13fda8113ee14f93bb6cc2ccf624a9818c7ab7e6b6eb1ffd2eeb40"} Mar 10 15:32:05 crc kubenswrapper[4795]: I0310 15:32:05.976958 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c944d45ece13fda8113ee14f93bb6cc2ccf624a9818c7ab7e6b6eb1ffd2eeb40" Mar 10 15:32:05 crc kubenswrapper[4795]: I0310 15:32:05.976959 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552612-xxx2v" Mar 10 15:32:06 crc kubenswrapper[4795]: I0310 15:32:06.028942 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552606-pxjj6"] Mar 10 15:32:06 crc kubenswrapper[4795]: I0310 15:32:06.037060 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552606-pxjj6"] Mar 10 15:32:07 crc kubenswrapper[4795]: I0310 15:32:07.493005 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f110d7f5-06c8-4211-b21e-7a52e6b694b7" path="/var/lib/kubelet/pods/f110d7f5-06c8-4211-b21e-7a52e6b694b7/volumes" Mar 10 15:32:18 crc kubenswrapper[4795]: I0310 15:32:18.539875 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 10 15:32:18 crc kubenswrapper[4795]: I0310 15:32:18.541046 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:32:42 crc kubenswrapper[4795]: I0310 15:32:42.820748 4795 scope.go:117] "RemoveContainer" containerID="136554a0c840e18c932b5d01915abcb6c227714fdc2a18d7f1f6da7fbf6dc48a" Mar 10 15:32:42 crc kubenswrapper[4795]: I0310 15:32:42.872818 4795 scope.go:117] "RemoveContainer" containerID="53f0e6f7d8f377f6099f2b35e606b666a5c2ba24737f78a1a2dad32e126a9957" Mar 10 15:32:42 crc kubenswrapper[4795]: I0310 15:32:42.924974 4795 scope.go:117] "RemoveContainer" containerID="496a0ce2c98b076ed87eb51f2cf780382ff8c97977b8e27b77d3a409e489d75f" Mar 10 15:32:42 crc kubenswrapper[4795]: I0310 15:32:42.985592 4795 scope.go:117] "RemoveContainer" containerID="dcdd2a66ae19a75a61ddb0c935ce0e52157cfcda7bd2a89358b39260b0cc667d" Mar 10 15:32:48 crc kubenswrapper[4795]: I0310 15:32:48.538827 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:32:48 crc kubenswrapper[4795]: I0310 15:32:48.539292 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:33:18 crc kubenswrapper[4795]: I0310 15:33:18.539857 4795 patch_prober.go:28] interesting 
pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:33:18 crc kubenswrapper[4795]: I0310 15:33:18.540393 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:33:18 crc kubenswrapper[4795]: I0310 15:33:18.540440 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:33:18 crc kubenswrapper[4795]: I0310 15:33:18.541180 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:33:18 crc kubenswrapper[4795]: I0310 15:33:18.541232 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" gracePeriod=600 Mar 10 15:33:18 crc kubenswrapper[4795]: E0310 15:33:18.675743 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:33:18 crc kubenswrapper[4795]: I0310 15:33:18.761695 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" exitCode=0 Mar 10 15:33:18 crc kubenswrapper[4795]: I0310 15:33:18.761755 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64"} Mar 10 15:33:18 crc kubenswrapper[4795]: I0310 15:33:18.761811 4795 scope.go:117] "RemoveContainer" containerID="bb8b52e71ef30156b6c9527b50e9705f2708ddbef05f7d631d0751d8aba7942b" Mar 10 15:33:18 crc kubenswrapper[4795]: I0310 15:33:18.762515 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:33:18 crc kubenswrapper[4795]: E0310 15:33:18.762823 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:33:33 crc kubenswrapper[4795]: I0310 15:33:33.476472 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:33:33 crc kubenswrapper[4795]: E0310 15:33:33.477343 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:33:43 crc kubenswrapper[4795]: I0310 15:33:43.097252 4795 scope.go:117] "RemoveContainer" containerID="ddcfcf311b3e75a7bed53df2080a34c754951c97da4d0aeb0dc6e5f01217dc9f" Mar 10 15:33:43 crc kubenswrapper[4795]: I0310 15:33:43.119814 4795 scope.go:117] "RemoveContainer" containerID="edcbcbaf212192eea93d3dbf886acd2b7e63ae57306352bb5381cc283e0965f7" Mar 10 15:33:43 crc kubenswrapper[4795]: I0310 15:33:43.138285 4795 scope.go:117] "RemoveContainer" containerID="1bec50d4abda5db782460c5b037001cb98c30bcc2d8d2f9ec800984b3bcbdd11" Mar 10 15:33:43 crc kubenswrapper[4795]: I0310 15:33:43.163999 4795 scope.go:117] "RemoveContainer" containerID="24b45801b3a62d365af23aed3af835c69aa98b3e0745f8992fd0bf396f21c584" Mar 10 15:33:47 crc kubenswrapper[4795]: I0310 15:33:47.484263 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:33:47 crc kubenswrapper[4795]: E0310 15:33:47.484869 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:33:59 crc kubenswrapper[4795]: I0310 15:33:59.163326 4795 generic.go:334] "Generic (PLEG): container finished" podID="74f728b8-775b-45f0-90f9-8e4d8e77e5bc" containerID="9b7d21d58325de0d016ab392915adf81f3943428d11ef7e990b9f97acff2deb5" exitCode=0 Mar 10 15:33:59 crc 
kubenswrapper[4795]: I0310 15:33:59.163397 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" event={"ID":"74f728b8-775b-45f0-90f9-8e4d8e77e5bc","Type":"ContainerDied","Data":"9b7d21d58325de0d016ab392915adf81f3943428d11ef7e990b9f97acff2deb5"} Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.154368 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552614-lkqnr"] Mar 10 15:34:00 crc kubenswrapper[4795]: E0310 15:34:00.154804 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a771fc14-3d9d-440c-a4e6-7e02569919c1" containerName="oc" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.154816 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a771fc14-3d9d-440c-a4e6-7e02569919c1" containerName="oc" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.155026 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a771fc14-3d9d-440c-a4e6-7e02569919c1" containerName="oc" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.155650 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552614-lkqnr"] Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.155724 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552614-lkqnr" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.176335 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.176854 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.183266 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.304908 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5g6f\" (UniqueName: \"kubernetes.io/projected/0990c949-f10b-4a1c-8bae-62f0ea1c0605-kube-api-access-q5g6f\") pod \"auto-csr-approver-29552614-lkqnr\" (UID: \"0990c949-f10b-4a1c-8bae-62f0ea1c0605\") " pod="openshift-infra/auto-csr-approver-29552614-lkqnr" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.406821 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5g6f\" (UniqueName: \"kubernetes.io/projected/0990c949-f10b-4a1c-8bae-62f0ea1c0605-kube-api-access-q5g6f\") pod \"auto-csr-approver-29552614-lkqnr\" (UID: \"0990c949-f10b-4a1c-8bae-62f0ea1c0605\") " pod="openshift-infra/auto-csr-approver-29552614-lkqnr" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.429040 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5g6f\" (UniqueName: \"kubernetes.io/projected/0990c949-f10b-4a1c-8bae-62f0ea1c0605-kube-api-access-q5g6f\") pod \"auto-csr-approver-29552614-lkqnr\" (UID: \"0990c949-f10b-4a1c-8bae-62f0ea1c0605\") " pod="openshift-infra/auto-csr-approver-29552614-lkqnr" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.477063 4795 scope.go:117] "RemoveContainer" 
containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:34:00 crc kubenswrapper[4795]: E0310 15:34:00.477424 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.506823 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552614-lkqnr" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.605212 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.712758 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmp5r\" (UniqueName: \"kubernetes.io/projected/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-kube-api-access-nmp5r\") pod \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.712814 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-bootstrap-combined-ca-bundle\") pod \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.712913 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-ssh-key-openstack-edpm-ipam\") pod \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.712960 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-inventory\") pod \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\" (UID: \"74f728b8-775b-45f0-90f9-8e4d8e77e5bc\") " Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.732123 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "74f728b8-775b-45f0-90f9-8e4d8e77e5bc" (UID: "74f728b8-775b-45f0-90f9-8e4d8e77e5bc"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.732534 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-kube-api-access-nmp5r" (OuterVolumeSpecName: "kube-api-access-nmp5r") pod "74f728b8-775b-45f0-90f9-8e4d8e77e5bc" (UID: "74f728b8-775b-45f0-90f9-8e4d8e77e5bc"). InnerVolumeSpecName "kube-api-access-nmp5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.741958 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "74f728b8-775b-45f0-90f9-8e4d8e77e5bc" (UID: "74f728b8-775b-45f0-90f9-8e4d8e77e5bc"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.751225 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-inventory" (OuterVolumeSpecName: "inventory") pod "74f728b8-775b-45f0-90f9-8e4d8e77e5bc" (UID: "74f728b8-775b-45f0-90f9-8e4d8e77e5bc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.815381 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmp5r\" (UniqueName: \"kubernetes.io/projected/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-kube-api-access-nmp5r\") on node \"crc\" DevicePath \"\"" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.815413 4795 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.815501 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.815636 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/74f728b8-775b-45f0-90f9-8e4d8e77e5bc-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:34:00 crc kubenswrapper[4795]: I0310 15:34:00.955993 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552614-lkqnr"] Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.199569 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" 
event={"ID":"74f728b8-775b-45f0-90f9-8e4d8e77e5bc","Type":"ContainerDied","Data":"44c5b49ab22218394511c28e98e901f2c1e51c6fc878e88efcc76ea3bddef488"} Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.199622 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c5b49ab22218394511c28e98e901f2c1e51c6fc878e88efcc76ea3bddef488" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.199600 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.200850 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552614-lkqnr" event={"ID":"0990c949-f10b-4a1c-8bae-62f0ea1c0605","Type":"ContainerStarted","Data":"35bfbf25915cdfdbc0c7a2a52eb604e1c54836da690e439212b546c7419d0f5d"} Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.329242 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r"] Mar 10 15:34:01 crc kubenswrapper[4795]: E0310 15:34:01.329703 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f728b8-775b-45f0-90f9-8e4d8e77e5bc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.329727 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f728b8-775b-45f0-90f9-8e4d8e77e5bc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.329994 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f728b8-775b-45f0-90f9-8e4d8e77e5bc" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.330807 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.332868 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.333530 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.333969 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.334026 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.337630 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r"] Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.436504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d992r\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.436650 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpg75\" (UniqueName: \"kubernetes.io/projected/0d84f498-2364-4d50-8dfa-c49547c2e29a-kube-api-access-wpg75\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d992r\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 
15:34:01.436707 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d992r\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.539058 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d992r\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.540369 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpg75\" (UniqueName: \"kubernetes.io/projected/0d84f498-2364-4d50-8dfa-c49547c2e29a-kube-api-access-wpg75\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d992r\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.540487 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d992r\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.544242 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d992r\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.545151 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d992r\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.558769 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpg75\" (UniqueName: \"kubernetes.io/projected/0d84f498-2364-4d50-8dfa-c49547c2e29a-kube-api-access-wpg75\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-d992r\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:34:01 crc kubenswrapper[4795]: I0310 15:34:01.648653 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:34:02 crc kubenswrapper[4795]: I0310 15:34:02.160476 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r"] Mar 10 15:34:02 crc kubenswrapper[4795]: W0310 15:34:02.164176 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d84f498_2364_4d50_8dfa_c49547c2e29a.slice/crio-7ae47baa929cd672b7892e18b2d1b58b6d4a1d8bec94e560f2c58a9995fad6c9 WatchSource:0}: Error finding container 7ae47baa929cd672b7892e18b2d1b58b6d4a1d8bec94e560f2c58a9995fad6c9: Status 404 returned error can't find the container with id 7ae47baa929cd672b7892e18b2d1b58b6d4a1d8bec94e560f2c58a9995fad6c9 Mar 10 15:34:02 crc kubenswrapper[4795]: I0310 15:34:02.211258 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" event={"ID":"0d84f498-2364-4d50-8dfa-c49547c2e29a","Type":"ContainerStarted","Data":"7ae47baa929cd672b7892e18b2d1b58b6d4a1d8bec94e560f2c58a9995fad6c9"} Mar 10 15:34:03 crc kubenswrapper[4795]: I0310 15:34:03.222182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" event={"ID":"0d84f498-2364-4d50-8dfa-c49547c2e29a","Type":"ContainerStarted","Data":"06d93b83f3edaf3d0eeafd822451198ece4280f9f5999f9deaa3fdbb79c64210"} Mar 10 15:34:03 crc kubenswrapper[4795]: I0310 15:34:03.226263 4795 generic.go:334] "Generic (PLEG): container finished" podID="0990c949-f10b-4a1c-8bae-62f0ea1c0605" containerID="67f41b05488a9c156c19dcb82206414d7badb3dd10ca05b347d891a12bda93a6" exitCode=0 Mar 10 15:34:03 crc kubenswrapper[4795]: I0310 15:34:03.226319 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552614-lkqnr" 
event={"ID":"0990c949-f10b-4a1c-8bae-62f0ea1c0605","Type":"ContainerDied","Data":"67f41b05488a9c156c19dcb82206414d7badb3dd10ca05b347d891a12bda93a6"} Mar 10 15:34:03 crc kubenswrapper[4795]: I0310 15:34:03.238032 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" podStartSLOduration=1.780974023 podStartE2EDuration="2.237991006s" podCreationTimestamp="2026-03-10 15:34:01 +0000 UTC" firstStartedPulling="2026-03-10 15:34:02.167497875 +0000 UTC m=+1675.333238773" lastFinishedPulling="2026-03-10 15:34:02.624514858 +0000 UTC m=+1675.790255756" observedRunningTime="2026-03-10 15:34:03.235819744 +0000 UTC m=+1676.401560692" watchObservedRunningTime="2026-03-10 15:34:03.237991006 +0000 UTC m=+1676.403731904" Mar 10 15:34:04 crc kubenswrapper[4795]: I0310 15:34:04.582053 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552614-lkqnr" Mar 10 15:34:04 crc kubenswrapper[4795]: I0310 15:34:04.699668 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5g6f\" (UniqueName: \"kubernetes.io/projected/0990c949-f10b-4a1c-8bae-62f0ea1c0605-kube-api-access-q5g6f\") pod \"0990c949-f10b-4a1c-8bae-62f0ea1c0605\" (UID: \"0990c949-f10b-4a1c-8bae-62f0ea1c0605\") " Mar 10 15:34:04 crc kubenswrapper[4795]: I0310 15:34:04.705257 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0990c949-f10b-4a1c-8bae-62f0ea1c0605-kube-api-access-q5g6f" (OuterVolumeSpecName: "kube-api-access-q5g6f") pod "0990c949-f10b-4a1c-8bae-62f0ea1c0605" (UID: "0990c949-f10b-4a1c-8bae-62f0ea1c0605"). InnerVolumeSpecName "kube-api-access-q5g6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:34:04 crc kubenswrapper[4795]: I0310 15:34:04.803177 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5g6f\" (UniqueName: \"kubernetes.io/projected/0990c949-f10b-4a1c-8bae-62f0ea1c0605-kube-api-access-q5g6f\") on node \"crc\" DevicePath \"\"" Mar 10 15:34:05 crc kubenswrapper[4795]: I0310 15:34:05.248028 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552614-lkqnr" event={"ID":"0990c949-f10b-4a1c-8bae-62f0ea1c0605","Type":"ContainerDied","Data":"35bfbf25915cdfdbc0c7a2a52eb604e1c54836da690e439212b546c7419d0f5d"} Mar 10 15:34:05 crc kubenswrapper[4795]: I0310 15:34:05.248089 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35bfbf25915cdfdbc0c7a2a52eb604e1c54836da690e439212b546c7419d0f5d" Mar 10 15:34:05 crc kubenswrapper[4795]: I0310 15:34:05.248189 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552614-lkqnr" Mar 10 15:34:05 crc kubenswrapper[4795]: I0310 15:34:05.651590 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552608-vwfhn"] Mar 10 15:34:05 crc kubenswrapper[4795]: I0310 15:34:05.660924 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552608-vwfhn"] Mar 10 15:34:07 crc kubenswrapper[4795]: I0310 15:34:07.489329 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c0d7784-e8d2-4b84-9332-185029fb41aa" path="/var/lib/kubelet/pods/0c0d7784-e8d2-4b84-9332-185029fb41aa/volumes" Mar 10 15:34:15 crc kubenswrapper[4795]: I0310 15:34:15.476418 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:34:15 crc kubenswrapper[4795]: E0310 15:34:15.477124 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:34:29 crc kubenswrapper[4795]: I0310 15:34:29.478318 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:34:29 crc kubenswrapper[4795]: E0310 15:34:29.479496 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:34:43 crc kubenswrapper[4795]: I0310 15:34:43.225434 4795 scope.go:117] "RemoveContainer" containerID="1a9c9b167e823eb5c00e5d22c93b25d39a5e09ba225d873cf169298953242506" Mar 10 15:34:43 crc kubenswrapper[4795]: I0310 15:34:43.246425 4795 scope.go:117] "RemoveContainer" containerID="dd4a52e971bad8551d909acd49617832f3d17b518ec47eb1d3ef2ee1788128c5" Mar 10 15:34:43 crc kubenswrapper[4795]: I0310 15:34:43.476813 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:34:43 crc kubenswrapper[4795]: E0310 15:34:43.477248 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:34:57 crc kubenswrapper[4795]: I0310 15:34:57.484218 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:34:57 crc kubenswrapper[4795]: E0310 15:34:57.484945 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:35:08 crc kubenswrapper[4795]: I0310 15:35:08.476341 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:35:08 crc kubenswrapper[4795]: E0310 15:35:08.477147 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:35:19 crc kubenswrapper[4795]: I0310 15:35:19.477295 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:35:19 crc kubenswrapper[4795]: E0310 15:35:19.477970 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:35:25 crc kubenswrapper[4795]: I0310 15:35:25.039274 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-68gqp"] Mar 10 15:35:25 crc kubenswrapper[4795]: I0310 15:35:25.048432 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-68gqp"] Mar 10 15:35:25 crc kubenswrapper[4795]: I0310 15:35:25.487203 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3d684b-c185-4a2b-a739-a40d239bc661" path="/var/lib/kubelet/pods/4f3d684b-c185-4a2b-a739-a40d239bc661/volumes" Mar 10 15:35:26 crc kubenswrapper[4795]: I0310 15:35:26.045723 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-wrt7x"] Mar 10 15:35:26 crc kubenswrapper[4795]: I0310 15:35:26.056181 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-wrt7x"] Mar 10 15:35:26 crc kubenswrapper[4795]: I0310 15:35:26.066165 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-751f-account-create-update-bngwt"] Mar 10 15:35:26 crc kubenswrapper[4795]: I0310 15:35:26.076960 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-280d-account-create-update-l8cfw"] Mar 10 15:35:26 crc kubenswrapper[4795]: I0310 15:35:26.086933 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-751f-account-create-update-bngwt"] Mar 10 15:35:26 crc kubenswrapper[4795]: I0310 15:35:26.094117 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-280d-account-create-update-l8cfw"] Mar 10 15:35:26 crc kubenswrapper[4795]: I0310 15:35:26.102000 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4nvx5"] Mar 10 15:35:26 crc 
kubenswrapper[4795]: I0310 15:35:26.109765 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6b7b-account-create-update-2mtzv"] Mar 10 15:35:26 crc kubenswrapper[4795]: I0310 15:35:26.117062 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4nvx5"] Mar 10 15:35:26 crc kubenswrapper[4795]: I0310 15:35:26.124596 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6b7b-account-create-update-2mtzv"] Mar 10 15:35:27 crc kubenswrapper[4795]: I0310 15:35:27.486031 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06792de7-f343-4a3f-ad0b-66577ab2aca6" path="/var/lib/kubelet/pods/06792de7-f343-4a3f-ad0b-66577ab2aca6/volumes" Mar 10 15:35:27 crc kubenswrapper[4795]: I0310 15:35:27.486883 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84099369-9f74-4a3a-baca-2a83bb833639" path="/var/lib/kubelet/pods/84099369-9f74-4a3a-baca-2a83bb833639/volumes" Mar 10 15:35:27 crc kubenswrapper[4795]: I0310 15:35:27.487743 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a52410d0-e4ef-4d3c-849f-be86e8e4333d" path="/var/lib/kubelet/pods/a52410d0-e4ef-4d3c-849f-be86e8e4333d/volumes" Mar 10 15:35:27 crc kubenswrapper[4795]: I0310 15:35:27.488277 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e420f1-ef1a-4069-b328-caa602467dff" path="/var/lib/kubelet/pods/d5e420f1-ef1a-4069-b328-caa602467dff/volumes" Mar 10 15:35:27 crc kubenswrapper[4795]: I0310 15:35:27.488807 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcc47465-6390-4daf-981d-477efb47c502" path="/var/lib/kubelet/pods/dcc47465-6390-4daf-981d-477efb47c502/volumes" Mar 10 15:35:33 crc kubenswrapper[4795]: I0310 15:35:33.476292 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:35:33 crc kubenswrapper[4795]: E0310 15:35:33.477010 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:35:41 crc kubenswrapper[4795]: I0310 15:35:41.675636 4795 generic.go:334] "Generic (PLEG): container finished" podID="0d84f498-2364-4d50-8dfa-c49547c2e29a" containerID="06d93b83f3edaf3d0eeafd822451198ece4280f9f5999f9deaa3fdbb79c64210" exitCode=0 Mar 10 15:35:41 crc kubenswrapper[4795]: I0310 15:35:41.675717 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" event={"ID":"0d84f498-2364-4d50-8dfa-c49547c2e29a","Type":"ContainerDied","Data":"06d93b83f3edaf3d0eeafd822451198ece4280f9f5999f9deaa3fdbb79c64210"} Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.077963 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.173728 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-inventory\") pod \"0d84f498-2364-4d50-8dfa-c49547c2e29a\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.173872 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-ssh-key-openstack-edpm-ipam\") pod \"0d84f498-2364-4d50-8dfa-c49547c2e29a\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.173950 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpg75\" (UniqueName: \"kubernetes.io/projected/0d84f498-2364-4d50-8dfa-c49547c2e29a-kube-api-access-wpg75\") pod \"0d84f498-2364-4d50-8dfa-c49547c2e29a\" (UID: \"0d84f498-2364-4d50-8dfa-c49547c2e29a\") " Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.189457 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d84f498-2364-4d50-8dfa-c49547c2e29a-kube-api-access-wpg75" (OuterVolumeSpecName: "kube-api-access-wpg75") pod "0d84f498-2364-4d50-8dfa-c49547c2e29a" (UID: "0d84f498-2364-4d50-8dfa-c49547c2e29a"). InnerVolumeSpecName "kube-api-access-wpg75". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.213325 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-inventory" (OuterVolumeSpecName: "inventory") pod "0d84f498-2364-4d50-8dfa-c49547c2e29a" (UID: "0d84f498-2364-4d50-8dfa-c49547c2e29a"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.278004 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.278037 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpg75\" (UniqueName: \"kubernetes.io/projected/0d84f498-2364-4d50-8dfa-c49547c2e29a-kube-api-access-wpg75\") on node \"crc\" DevicePath \"\"" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.296242 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0d84f498-2364-4d50-8dfa-c49547c2e29a" (UID: "0d84f498-2364-4d50-8dfa-c49547c2e29a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.342094 4795 scope.go:117] "RemoveContainer" containerID="88d260b87ff73e121e6f1c9a7909298430d8a87a7f2544d4f0366a925f2f6acd" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.363998 4795 scope.go:117] "RemoveContainer" containerID="b7de940adacd66cc1f65d87748aca94c04d53704d993b9ff4465b4e3b00aacb9" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.380173 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d84f498-2364-4d50-8dfa-c49547c2e29a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.384200 4795 scope.go:117] "RemoveContainer" containerID="60dda952fab40ff5b66b3707749e6cf3fc95a18b7aaa9beefb8ff20d28d595ed" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.403546 4795 scope.go:117] "RemoveContainer" containerID="3be0320b536cdf2bc0798c16517491d4a92ebac4beca8079c069ba5b805f8075" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.424119 4795 scope.go:117] "RemoveContainer" containerID="09744bb1fd442df92c8a1cb70ae01868e9b50693a0f91d0ef495f07c95af86b8" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.444343 4795 scope.go:117] "RemoveContainer" containerID="176afdd3bc3ab97589f8c5d1929bb8f28e42e3d0a14c757e17e490e339c9bee7" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.693264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" event={"ID":"0d84f498-2364-4d50-8dfa-c49547c2e29a","Type":"ContainerDied","Data":"7ae47baa929cd672b7892e18b2d1b58b6d4a1d8bec94e560f2c58a9995fad6c9"} Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.693618 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ae47baa929cd672b7892e18b2d1b58b6d4a1d8bec94e560f2c58a9995fad6c9" Mar 10 15:35:43 crc 
kubenswrapper[4795]: I0310 15:35:43.693352 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-d992r" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.787826 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm"] Mar 10 15:35:43 crc kubenswrapper[4795]: E0310 15:35:43.788302 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0990c949-f10b-4a1c-8bae-62f0ea1c0605" containerName="oc" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.788324 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0990c949-f10b-4a1c-8bae-62f0ea1c0605" containerName="oc" Mar 10 15:35:43 crc kubenswrapper[4795]: E0310 15:35:43.788338 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d84f498-2364-4d50-8dfa-c49547c2e29a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.788346 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d84f498-2364-4d50-8dfa-c49547c2e29a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.788562 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0990c949-f10b-4a1c-8bae-62f0ea1c0605" containerName="oc" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.788590 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d84f498-2364-4d50-8dfa-c49547c2e29a" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.789263 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.791313 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.791567 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.791759 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.791904 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.797872 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm"] Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.889582 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.889683 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:35:43 crc 
kubenswrapper[4795]: I0310 15:35:43.889715 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpm57\" (UniqueName: \"kubernetes.io/projected/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-kube-api-access-bpm57\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.992363 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.992531 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.992571 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpm57\" (UniqueName: \"kubernetes.io/projected/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-kube-api-access-bpm57\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:35:43 crc kubenswrapper[4795]: I0310 15:35:43.999844 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:35:44 crc kubenswrapper[4795]: I0310 15:35:44.000496 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:35:44 crc kubenswrapper[4795]: I0310 15:35:44.016962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpm57\" (UniqueName: \"kubernetes.io/projected/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-kube-api-access-bpm57\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:35:44 crc kubenswrapper[4795]: I0310 15:35:44.050930 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jn6h5"] Mar 10 15:35:44 crc kubenswrapper[4795]: I0310 15:35:44.059321 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jn6h5"] Mar 10 15:35:44 crc kubenswrapper[4795]: I0310 15:35:44.104781 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:35:44 crc kubenswrapper[4795]: I0310 15:35:44.597610 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm"] Mar 10 15:35:44 crc kubenswrapper[4795]: I0310 15:35:44.703449 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" event={"ID":"40ff419c-9d5a-4d92-8bfe-40edd38f79ba","Type":"ContainerStarted","Data":"16802f51c508bbf150ef8a6c9297ca5862d281358930c0c663c0805de049f302"} Mar 10 15:35:45 crc kubenswrapper[4795]: I0310 15:35:45.491199 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80269433-7015-4f86-ac1e-81634478a4b4" path="/var/lib/kubelet/pods/80269433-7015-4f86-ac1e-81634478a4b4/volumes" Mar 10 15:35:45 crc kubenswrapper[4795]: I0310 15:35:45.716105 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" event={"ID":"40ff419c-9d5a-4d92-8bfe-40edd38f79ba","Type":"ContainerStarted","Data":"83c7e0a0715de8570ea3ea1589221847a3d64adf3015b04352ed75f7a0dceac9"} Mar 10 15:35:45 crc kubenswrapper[4795]: I0310 15:35:45.738428 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" podStartSLOduration=2.137025479 podStartE2EDuration="2.738407234s" podCreationTimestamp="2026-03-10 15:35:43 +0000 UTC" firstStartedPulling="2026-03-10 15:35:44.606528781 +0000 UTC m=+1777.772269679" lastFinishedPulling="2026-03-10 15:35:45.207910536 +0000 UTC m=+1778.373651434" observedRunningTime="2026-03-10 15:35:45.729527821 +0000 UTC m=+1778.895268719" watchObservedRunningTime="2026-03-10 15:35:45.738407234 +0000 UTC m=+1778.904148132" Mar 10 15:35:46 crc kubenswrapper[4795]: I0310 15:35:46.477044 4795 scope.go:117] "RemoveContainer" 
containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:35:46 crc kubenswrapper[4795]: E0310 15:35:46.477266 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:35:57 crc kubenswrapper[4795]: I0310 15:35:57.038355 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-hr6mw"] Mar 10 15:35:57 crc kubenswrapper[4795]: I0310 15:35:57.046868 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-hr6mw"] Mar 10 15:35:57 crc kubenswrapper[4795]: I0310 15:35:57.489326 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994" path="/var/lib/kubelet/pods/a2b2bb45-f3b1-4fd2-a2b1-2ba3a6a71994/volumes" Mar 10 15:35:59 crc kubenswrapper[4795]: I0310 15:35:59.476808 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:35:59 crc kubenswrapper[4795]: E0310 15:35:59.477366 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:36:00 crc kubenswrapper[4795]: I0310 15:36:00.139935 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552616-2sphr"] Mar 10 15:36:00 crc 
kubenswrapper[4795]: I0310 15:36:00.141670 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552616-2sphr" Mar 10 15:36:00 crc kubenswrapper[4795]: I0310 15:36:00.144821 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:36:00 crc kubenswrapper[4795]: I0310 15:36:00.144857 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:36:00 crc kubenswrapper[4795]: I0310 15:36:00.144930 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:36:00 crc kubenswrapper[4795]: I0310 15:36:00.150670 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552616-2sphr"] Mar 10 15:36:00 crc kubenswrapper[4795]: I0310 15:36:00.185447 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v795n\" (UniqueName: \"kubernetes.io/projected/91365c19-0f5e-41c5-b530-5aa4a7062f02-kube-api-access-v795n\") pod \"auto-csr-approver-29552616-2sphr\" (UID: \"91365c19-0f5e-41c5-b530-5aa4a7062f02\") " pod="openshift-infra/auto-csr-approver-29552616-2sphr" Mar 10 15:36:00 crc kubenswrapper[4795]: I0310 15:36:00.288436 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v795n\" (UniqueName: \"kubernetes.io/projected/91365c19-0f5e-41c5-b530-5aa4a7062f02-kube-api-access-v795n\") pod \"auto-csr-approver-29552616-2sphr\" (UID: \"91365c19-0f5e-41c5-b530-5aa4a7062f02\") " pod="openshift-infra/auto-csr-approver-29552616-2sphr" Mar 10 15:36:00 crc kubenswrapper[4795]: I0310 15:36:00.308450 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v795n\" (UniqueName: \"kubernetes.io/projected/91365c19-0f5e-41c5-b530-5aa4a7062f02-kube-api-access-v795n\") pod 
\"auto-csr-approver-29552616-2sphr\" (UID: \"91365c19-0f5e-41c5-b530-5aa4a7062f02\") " pod="openshift-infra/auto-csr-approver-29552616-2sphr" Mar 10 15:36:00 crc kubenswrapper[4795]: I0310 15:36:00.461002 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552616-2sphr" Mar 10 15:36:00 crc kubenswrapper[4795]: I0310 15:36:00.930881 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552616-2sphr"] Mar 10 15:36:00 crc kubenswrapper[4795]: W0310 15:36:00.933418 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91365c19_0f5e_41c5_b530_5aa4a7062f02.slice/crio-73c70c3d1691f16acaa088e38ef930c4f39746331cb56d2c66b15eda8ffd132b WatchSource:0}: Error finding container 73c70c3d1691f16acaa088e38ef930c4f39746331cb56d2c66b15eda8ffd132b: Status 404 returned error can't find the container with id 73c70c3d1691f16acaa088e38ef930c4f39746331cb56d2c66b15eda8ffd132b Mar 10 15:36:01 crc kubenswrapper[4795]: I0310 15:36:01.861944 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552616-2sphr" event={"ID":"91365c19-0f5e-41c5-b530-5aa4a7062f02","Type":"ContainerStarted","Data":"73c70c3d1691f16acaa088e38ef930c4f39746331cb56d2c66b15eda8ffd132b"} Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.038941 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-73be-account-create-update-pwvb8"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.049735 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-664d-account-create-update-f29p6"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.061198 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-d52c-account-create-update-d6hmx"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.072083 4795 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/cinder-db-create-zf2n4"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.080278 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-664d-account-create-update-f29p6"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.087742 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-d52c-account-create-update-d6hmx"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.094185 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-zf2n4"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.100767 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-8lgbz"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.107292 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-73be-account-create-update-pwvb8"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.114778 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-8lgbz"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.121824 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-cwpm2"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.129239 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-cwpm2"] Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.876131 4795 generic.go:334] "Generic (PLEG): container finished" podID="91365c19-0f5e-41c5-b530-5aa4a7062f02" containerID="fb1285a2dcc580e8d90d3f4ddc8310487a11a86c3679d12cb616b1ce3b40fa01" exitCode=0 Mar 10 15:36:02 crc kubenswrapper[4795]: I0310 15:36:02.876174 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552616-2sphr" event={"ID":"91365c19-0f5e-41c5-b530-5aa4a7062f02","Type":"ContainerDied","Data":"fb1285a2dcc580e8d90d3f4ddc8310487a11a86c3679d12cb616b1ce3b40fa01"} Mar 10 15:36:03 crc 
kubenswrapper[4795]: I0310 15:36:03.502431 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="325f8d75-2010-44ed-a01a-954670df7e15" path="/var/lib/kubelet/pods/325f8d75-2010-44ed-a01a-954670df7e15/volumes" Mar 10 15:36:03 crc kubenswrapper[4795]: I0310 15:36:03.503544 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="450e20be-0c76-4062-8e10-4a11808a9cca" path="/var/lib/kubelet/pods/450e20be-0c76-4062-8e10-4a11808a9cca/volumes" Mar 10 15:36:03 crc kubenswrapper[4795]: I0310 15:36:03.504174 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c2d650a-a19d-4c82-a3fe-34850d8dbbc5" path="/var/lib/kubelet/pods/6c2d650a-a19d-4c82-a3fe-34850d8dbbc5/volumes" Mar 10 15:36:03 crc kubenswrapper[4795]: I0310 15:36:03.504796 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="859fb054-ddc6-430b-b049-0571c3c57be3" path="/var/lib/kubelet/pods/859fb054-ddc6-430b-b049-0571c3c57be3/volumes" Mar 10 15:36:03 crc kubenswrapper[4795]: I0310 15:36:03.506176 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d2d3011-1ce1-43c4-b058-e2171446b079" path="/var/lib/kubelet/pods/9d2d3011-1ce1-43c4-b058-e2171446b079/volumes" Mar 10 15:36:03 crc kubenswrapper[4795]: I0310 15:36:03.506919 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c217cb85-4bff-4cc2-a554-8dd436e093b0" path="/var/lib/kubelet/pods/c217cb85-4bff-4cc2-a554-8dd436e093b0/volumes" Mar 10 15:36:04 crc kubenswrapper[4795]: I0310 15:36:04.252490 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552616-2sphr" Mar 10 15:36:04 crc kubenswrapper[4795]: I0310 15:36:04.370552 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v795n\" (UniqueName: \"kubernetes.io/projected/91365c19-0f5e-41c5-b530-5aa4a7062f02-kube-api-access-v795n\") pod \"91365c19-0f5e-41c5-b530-5aa4a7062f02\" (UID: \"91365c19-0f5e-41c5-b530-5aa4a7062f02\") " Mar 10 15:36:04 crc kubenswrapper[4795]: I0310 15:36:04.378952 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91365c19-0f5e-41c5-b530-5aa4a7062f02-kube-api-access-v795n" (OuterVolumeSpecName: "kube-api-access-v795n") pod "91365c19-0f5e-41c5-b530-5aa4a7062f02" (UID: "91365c19-0f5e-41c5-b530-5aa4a7062f02"). InnerVolumeSpecName "kube-api-access-v795n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:36:04 crc kubenswrapper[4795]: I0310 15:36:04.472693 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v795n\" (UniqueName: \"kubernetes.io/projected/91365c19-0f5e-41c5-b530-5aa4a7062f02-kube-api-access-v795n\") on node \"crc\" DevicePath \"\"" Mar 10 15:36:04 crc kubenswrapper[4795]: I0310 15:36:04.895109 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552616-2sphr" event={"ID":"91365c19-0f5e-41c5-b530-5aa4a7062f02","Type":"ContainerDied","Data":"73c70c3d1691f16acaa088e38ef930c4f39746331cb56d2c66b15eda8ffd132b"} Mar 10 15:36:04 crc kubenswrapper[4795]: I0310 15:36:04.895154 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73c70c3d1691f16acaa088e38ef930c4f39746331cb56d2c66b15eda8ffd132b" Mar 10 15:36:04 crc kubenswrapper[4795]: I0310 15:36:04.895211 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552616-2sphr" Mar 10 15:36:05 crc kubenswrapper[4795]: I0310 15:36:05.306459 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552610-752kh"] Mar 10 15:36:05 crc kubenswrapper[4795]: I0310 15:36:05.316827 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552610-752kh"] Mar 10 15:36:05 crc kubenswrapper[4795]: I0310 15:36:05.487552 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5" path="/var/lib/kubelet/pods/2e45ef2b-0fcc-4bde-b4ca-a9ded60343c5/volumes" Mar 10 15:36:10 crc kubenswrapper[4795]: I0310 15:36:10.477359 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:36:10 crc kubenswrapper[4795]: E0310 15:36:10.478253 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:36:11 crc kubenswrapper[4795]: I0310 15:36:11.029791 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-ksntn"] Mar 10 15:36:11 crc kubenswrapper[4795]: I0310 15:36:11.039899 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-ksntn"] Mar 10 15:36:11 crc kubenswrapper[4795]: I0310 15:36:11.488827 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5" path="/var/lib/kubelet/pods/948aa5b8-ffc4-4a79-acf6-c7b5f2bb9cb5/volumes" Mar 10 15:36:23 crc kubenswrapper[4795]: I0310 15:36:23.477095 4795 scope.go:117] 
"RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:36:23 crc kubenswrapper[4795]: E0310 15:36:23.477861 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:36:38 crc kubenswrapper[4795]: I0310 15:36:38.044649 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-tk68s"] Mar 10 15:36:38 crc kubenswrapper[4795]: I0310 15:36:38.053828 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-tk68s"] Mar 10 15:36:38 crc kubenswrapper[4795]: I0310 15:36:38.476012 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:36:38 crc kubenswrapper[4795]: E0310 15:36:38.476552 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:36:39 crc kubenswrapper[4795]: I0310 15:36:39.490060 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07ac8661-08c8-49aa-ae40-cf472895a954" path="/var/lib/kubelet/pods/07ac8661-08c8-49aa-ae40-cf472895a954/volumes" Mar 10 15:36:43 crc kubenswrapper[4795]: I0310 15:36:43.572659 4795 scope.go:117] "RemoveContainer" 
containerID="94143bbcaa5dbca085364e324ff835b43fdff929594d46a37cee4aa3fa2dce9d" Mar 10 15:36:43 crc kubenswrapper[4795]: I0310 15:36:43.627458 4795 scope.go:117] "RemoveContainer" containerID="1ee644d2b01d0d8f676164c61c90a5e0951443a0f999632dc3cd42f51fb7ad22" Mar 10 15:36:43 crc kubenswrapper[4795]: I0310 15:36:43.709832 4795 scope.go:117] "RemoveContainer" containerID="5672b8da337c8b685d6738d823f5cc15264092331dea35e99cd89ddfd277abcf" Mar 10 15:36:43 crc kubenswrapper[4795]: I0310 15:36:43.755918 4795 scope.go:117] "RemoveContainer" containerID="c4fc693cc16df99231070db18f9638dcba4cd78e806c3601418f73c2e0dbed82" Mar 10 15:36:43 crc kubenswrapper[4795]: I0310 15:36:43.776182 4795 scope.go:117] "RemoveContainer" containerID="f7460ed6807bd1db5c1d96184e08d131859e20f5d1c897feef35c304b1182395" Mar 10 15:36:43 crc kubenswrapper[4795]: I0310 15:36:43.821894 4795 scope.go:117] "RemoveContainer" containerID="08b9db2fb21ce2d930ad792b63000711fa2bb8a01f209b7ac7a8b780cb053953" Mar 10 15:36:43 crc kubenswrapper[4795]: I0310 15:36:43.877979 4795 scope.go:117] "RemoveContainer" containerID="4ba85f207212863dae9dfe324ff3534baf611b9fb69980bbcb335b5e6241edc9" Mar 10 15:36:43 crc kubenswrapper[4795]: I0310 15:36:43.924980 4795 scope.go:117] "RemoveContainer" containerID="5ea8414e86e0e61cabdd23bdc8e027670f375ece690a50e7040aa7e0d6e6ab41" Mar 10 15:36:43 crc kubenswrapper[4795]: I0310 15:36:43.951503 4795 scope.go:117] "RemoveContainer" containerID="37f7be6671ae9a44c0e7b4d21139278390629316894e34a28da6559207ece2e3" Mar 10 15:36:43 crc kubenswrapper[4795]: I0310 15:36:43.987134 4795 scope.go:117] "RemoveContainer" containerID="73e3d665d4e95bccb029e0366456de02da3d428fe7eb1c17ddf7df05d991dd13" Mar 10 15:36:44 crc kubenswrapper[4795]: I0310 15:36:44.033128 4795 scope.go:117] "RemoveContainer" containerID="57aa263732b323e9ae994e5fec3e55a8b13ef707bf2da626a94b6f6fd611c5cf" Mar 10 15:36:45 crc kubenswrapper[4795]: I0310 15:36:45.058046 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/placement-db-sync-t2rg9"] Mar 10 15:36:45 crc kubenswrapper[4795]: I0310 15:36:45.065625 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-t2rg9"] Mar 10 15:36:45 crc kubenswrapper[4795]: I0310 15:36:45.500662 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f93d5e00-4866-406a-ad39-c3ab0b2156b0" path="/var/lib/kubelet/pods/f93d5e00-4866-406a-ad39-c3ab0b2156b0/volumes" Mar 10 15:36:50 crc kubenswrapper[4795]: I0310 15:36:50.036802 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-9wkrh"] Mar 10 15:36:50 crc kubenswrapper[4795]: I0310 15:36:50.045750 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-9ktzf"] Mar 10 15:36:50 crc kubenswrapper[4795]: I0310 15:36:50.053784 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-9wkrh"] Mar 10 15:36:50 crc kubenswrapper[4795]: I0310 15:36:50.061779 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-9ktzf"] Mar 10 15:36:51 crc kubenswrapper[4795]: I0310 15:36:51.476855 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:36:51 crc kubenswrapper[4795]: E0310 15:36:51.477186 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:36:51 crc kubenswrapper[4795]: I0310 15:36:51.485996 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e5de16-defe-4daa-94cc-3d50e3461dbd" 
path="/var/lib/kubelet/pods/e3e5de16-defe-4daa-94cc-3d50e3461dbd/volumes" Mar 10 15:36:51 crc kubenswrapper[4795]: I0310 15:36:51.486554 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c3084a-a7d5-4703-836b-951571462fee" path="/var/lib/kubelet/pods/f9c3084a-a7d5-4703-836b-951571462fee/volumes" Mar 10 15:36:55 crc kubenswrapper[4795]: I0310 15:36:55.364245 4795 generic.go:334] "Generic (PLEG): container finished" podID="40ff419c-9d5a-4d92-8bfe-40edd38f79ba" containerID="83c7e0a0715de8570ea3ea1589221847a3d64adf3015b04352ed75f7a0dceac9" exitCode=0 Mar 10 15:36:55 crc kubenswrapper[4795]: I0310 15:36:55.364384 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" event={"ID":"40ff419c-9d5a-4d92-8bfe-40edd38f79ba","Type":"ContainerDied","Data":"83c7e0a0715de8570ea3ea1589221847a3d64adf3015b04352ed75f7a0dceac9"} Mar 10 15:36:56 crc kubenswrapper[4795]: I0310 15:36:56.797445 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:36:56 crc kubenswrapper[4795]: I0310 15:36:56.837055 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-inventory\") pod \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " Mar 10 15:36:56 crc kubenswrapper[4795]: I0310 15:36:56.837154 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpm57\" (UniqueName: \"kubernetes.io/projected/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-kube-api-access-bpm57\") pod \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " Mar 10 15:36:56 crc kubenswrapper[4795]: I0310 15:36:56.837190 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-ssh-key-openstack-edpm-ipam\") pod \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\" (UID: \"40ff419c-9d5a-4d92-8bfe-40edd38f79ba\") " Mar 10 15:36:56 crc kubenswrapper[4795]: I0310 15:36:56.842719 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-kube-api-access-bpm57" (OuterVolumeSpecName: "kube-api-access-bpm57") pod "40ff419c-9d5a-4d92-8bfe-40edd38f79ba" (UID: "40ff419c-9d5a-4d92-8bfe-40edd38f79ba"). InnerVolumeSpecName "kube-api-access-bpm57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:36:56 crc kubenswrapper[4795]: I0310 15:36:56.865023 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-inventory" (OuterVolumeSpecName: "inventory") pod "40ff419c-9d5a-4d92-8bfe-40edd38f79ba" (UID: "40ff419c-9d5a-4d92-8bfe-40edd38f79ba"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:36:56 crc kubenswrapper[4795]: I0310 15:36:56.871844 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "40ff419c-9d5a-4d92-8bfe-40edd38f79ba" (UID: "40ff419c-9d5a-4d92-8bfe-40edd38f79ba"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:36:56 crc kubenswrapper[4795]: I0310 15:36:56.939561 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:36:56 crc kubenswrapper[4795]: I0310 15:36:56.939599 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpm57\" (UniqueName: \"kubernetes.io/projected/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-kube-api-access-bpm57\") on node \"crc\" DevicePath \"\"" Mar 10 15:36:56 crc kubenswrapper[4795]: I0310 15:36:56.939610 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/40ff419c-9d5a-4d92-8bfe-40edd38f79ba-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.383291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" event={"ID":"40ff419c-9d5a-4d92-8bfe-40edd38f79ba","Type":"ContainerDied","Data":"16802f51c508bbf150ef8a6c9297ca5862d281358930c0c663c0805de049f302"} Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.383657 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16802f51c508bbf150ef8a6c9297ca5862d281358930c0c663c0805de049f302" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 
15:36:57.383363 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.473600 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2"] Mar 10 15:36:57 crc kubenswrapper[4795]: E0310 15:36:57.474233 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40ff419c-9d5a-4d92-8bfe-40edd38f79ba" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.474258 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="40ff419c-9d5a-4d92-8bfe-40edd38f79ba" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 15:36:57 crc kubenswrapper[4795]: E0310 15:36:57.474299 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91365c19-0f5e-41c5-b530-5aa4a7062f02" containerName="oc" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.474310 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="91365c19-0f5e-41c5-b530-5aa4a7062f02" containerName="oc" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.474601 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="91365c19-0f5e-41c5-b530-5aa4a7062f02" containerName="oc" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.474650 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="40ff419c-9d5a-4d92-8bfe-40edd38f79ba" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.475553 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.483632 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.483800 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.484769 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.484923 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.494515 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2"] Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.551835 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.551896 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrw6n\" (UniqueName: \"kubernetes.io/projected/c877826d-0c20-4e0d-b4b9-e11b301c36d3-kube-api-access-jrw6n\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 
15:36:57.552284 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.653718 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.653793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.653822 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrw6n\" (UniqueName: \"kubernetes.io/projected/c877826d-0c20-4e0d-b4b9-e11b301c36d3-kube-api-access-jrw6n\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.662123 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.662174 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.670497 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrw6n\" (UniqueName: \"kubernetes.io/projected/c877826d-0c20-4e0d-b4b9-e11b301c36d3-kube-api-access-jrw6n\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:36:57 crc kubenswrapper[4795]: I0310 15:36:57.820299 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:36:58 crc kubenswrapper[4795]: I0310 15:36:58.352811 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2"] Mar 10 15:36:58 crc kubenswrapper[4795]: I0310 15:36:58.356945 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:36:58 crc kubenswrapper[4795]: I0310 15:36:58.393840 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" event={"ID":"c877826d-0c20-4e0d-b4b9-e11b301c36d3","Type":"ContainerStarted","Data":"02408f127b2ff3ea231d672c03f64eccb0d40d19344c9d0f525ba8770f3be7f6"} Mar 10 15:36:59 crc kubenswrapper[4795]: I0310 15:36:59.401956 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" event={"ID":"c877826d-0c20-4e0d-b4b9-e11b301c36d3","Type":"ContainerStarted","Data":"c0272381f53f70eda804264754645daedd5e30bba65542e333203d4775eebca6"} Mar 10 15:36:59 crc kubenswrapper[4795]: I0310 15:36:59.417774 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" podStartSLOduration=1.600974981 podStartE2EDuration="2.417758744s" podCreationTimestamp="2026-03-10 15:36:57 +0000 UTC" firstStartedPulling="2026-03-10 15:36:58.356708907 +0000 UTC m=+1851.522449805" lastFinishedPulling="2026-03-10 15:36:59.17349267 +0000 UTC m=+1852.339233568" observedRunningTime="2026-03-10 15:36:59.416100177 +0000 UTC m=+1852.581841075" watchObservedRunningTime="2026-03-10 15:36:59.417758744 +0000 UTC m=+1852.583499632" Mar 10 15:37:02 crc kubenswrapper[4795]: I0310 15:37:02.478650 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:37:02 
crc kubenswrapper[4795]: E0310 15:37:02.479673 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:37:03 crc kubenswrapper[4795]: I0310 15:37:03.032662 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mg79m"] Mar 10 15:37:03 crc kubenswrapper[4795]: I0310 15:37:03.043029 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-mg79m"] Mar 10 15:37:03 crc kubenswrapper[4795]: I0310 15:37:03.488217 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2" path="/var/lib/kubelet/pods/7d93f83f-7b3c-45d8-bd1a-8748ac0cd2d2/volumes" Mar 10 15:37:04 crc kubenswrapper[4795]: I0310 15:37:04.451130 4795 generic.go:334] "Generic (PLEG): container finished" podID="c877826d-0c20-4e0d-b4b9-e11b301c36d3" containerID="c0272381f53f70eda804264754645daedd5e30bba65542e333203d4775eebca6" exitCode=0 Mar 10 15:37:04 crc kubenswrapper[4795]: I0310 15:37:04.451175 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" event={"ID":"c877826d-0c20-4e0d-b4b9-e11b301c36d3","Type":"ContainerDied","Data":"c0272381f53f70eda804264754645daedd5e30bba65542e333203d4775eebca6"} Mar 10 15:37:05 crc kubenswrapper[4795]: I0310 15:37:05.841103 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:37:05 crc kubenswrapper[4795]: I0310 15:37:05.913520 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-inventory\") pod \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " Mar 10 15:37:05 crc kubenswrapper[4795]: I0310 15:37:05.913592 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrw6n\" (UniqueName: \"kubernetes.io/projected/c877826d-0c20-4e0d-b4b9-e11b301c36d3-kube-api-access-jrw6n\") pod \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " Mar 10 15:37:05 crc kubenswrapper[4795]: I0310 15:37:05.913662 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-ssh-key-openstack-edpm-ipam\") pod \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\" (UID: \"c877826d-0c20-4e0d-b4b9-e11b301c36d3\") " Mar 10 15:37:05 crc kubenswrapper[4795]: I0310 15:37:05.922299 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c877826d-0c20-4e0d-b4b9-e11b301c36d3-kube-api-access-jrw6n" (OuterVolumeSpecName: "kube-api-access-jrw6n") pod "c877826d-0c20-4e0d-b4b9-e11b301c36d3" (UID: "c877826d-0c20-4e0d-b4b9-e11b301c36d3"). InnerVolumeSpecName "kube-api-access-jrw6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:37:05 crc kubenswrapper[4795]: I0310 15:37:05.947573 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-inventory" (OuterVolumeSpecName: "inventory") pod "c877826d-0c20-4e0d-b4b9-e11b301c36d3" (UID: "c877826d-0c20-4e0d-b4b9-e11b301c36d3"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:37:05 crc kubenswrapper[4795]: I0310 15:37:05.948836 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c877826d-0c20-4e0d-b4b9-e11b301c36d3" (UID: "c877826d-0c20-4e0d-b4b9-e11b301c36d3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.015771 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.016054 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrw6n\" (UniqueName: \"kubernetes.io/projected/c877826d-0c20-4e0d-b4b9-e11b301c36d3-kube-api-access-jrw6n\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.016145 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c877826d-0c20-4e0d-b4b9-e11b301c36d3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.468656 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" event={"ID":"c877826d-0c20-4e0d-b4b9-e11b301c36d3","Type":"ContainerDied","Data":"02408f127b2ff3ea231d672c03f64eccb0d40d19344c9d0f525ba8770f3be7f6"} Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.468975 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02408f127b2ff3ea231d672c03f64eccb0d40d19344c9d0f525ba8770f3be7f6" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 
15:37:06.468742 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.542004 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99"] Mar 10 15:37:06 crc kubenswrapper[4795]: E0310 15:37:06.544401 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c877826d-0c20-4e0d-b4b9-e11b301c36d3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.544425 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c877826d-0c20-4e0d-b4b9-e11b301c36d3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.544711 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c877826d-0c20-4e0d-b4b9-e11b301c36d3" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.545320 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.549409 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.549764 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.550010 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.550169 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.555177 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99"] Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.626460 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrjg6\" (UniqueName: \"kubernetes.io/projected/f7025f5a-4547-4d24-8e11-bb54a9ff4311-kube-api-access-mrjg6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-snr99\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.626589 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-snr99\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 
15:37:06.626622 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-snr99\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.728894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-snr99\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.729012 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-snr99\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.729250 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrjg6\" (UniqueName: \"kubernetes.io/projected/f7025f5a-4547-4d24-8e11-bb54a9ff4311-kube-api-access-mrjg6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-snr99\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.733336 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-snr99\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.735719 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-snr99\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.745276 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrjg6\" (UniqueName: \"kubernetes.io/projected/f7025f5a-4547-4d24-8e11-bb54a9ff4311-kube-api-access-mrjg6\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-snr99\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:06 crc kubenswrapper[4795]: I0310 15:37:06.861746 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:07 crc kubenswrapper[4795]: I0310 15:37:07.353063 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99"] Mar 10 15:37:07 crc kubenswrapper[4795]: I0310 15:37:07.499097 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" event={"ID":"f7025f5a-4547-4d24-8e11-bb54a9ff4311","Type":"ContainerStarted","Data":"48032f17291bdedf84245f828255cf8798f2ff642e9273171e92a4643663c956"} Mar 10 15:37:08 crc kubenswrapper[4795]: I0310 15:37:08.962944 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:37:09 crc kubenswrapper[4795]: I0310 15:37:09.502419 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" event={"ID":"f7025f5a-4547-4d24-8e11-bb54a9ff4311","Type":"ContainerStarted","Data":"338b04783e1e9ea79a40ddf74059a46423ee03e6e173d6e23aa3ce591d4717ae"} Mar 10 15:37:09 crc kubenswrapper[4795]: I0310 15:37:09.530155 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" podStartSLOduration=1.932692098 podStartE2EDuration="3.530137253s" podCreationTimestamp="2026-03-10 15:37:06 +0000 UTC" firstStartedPulling="2026-03-10 15:37:07.362764562 +0000 UTC m=+1860.528505460" lastFinishedPulling="2026-03-10 15:37:08.960209707 +0000 UTC m=+1862.125950615" observedRunningTime="2026-03-10 15:37:09.52375599 +0000 UTC m=+1862.689496888" watchObservedRunningTime="2026-03-10 15:37:09.530137253 +0000 UTC m=+1862.695878141" Mar 10 15:37:13 crc kubenswrapper[4795]: I0310 15:37:13.476853 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:37:13 crc 
kubenswrapper[4795]: E0310 15:37:13.477520 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:37:28 crc kubenswrapper[4795]: I0310 15:37:28.477002 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:37:28 crc kubenswrapper[4795]: E0310 15:37:28.477810 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:37:39 crc kubenswrapper[4795]: I0310 15:37:39.476810 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:37:39 crc kubenswrapper[4795]: E0310 15:37:39.477580 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.050096 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ede0-account-create-update-s67gw"] Mar 
10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.060511 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-e8a3-account-create-update-lzpjf"] Mar 10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.072819 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-4eb7-account-create-update-77mdf"] Mar 10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.083452 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-9gzkw"] Mar 10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.089617 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-nmqx6"] Mar 10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.097362 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ede0-account-create-update-s67gw"] Mar 10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.104714 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-e8a3-account-create-update-lzpjf"] Mar 10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.112597 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-9gzkw"] Mar 10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.120216 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-7t4xc"] Mar 10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.127915 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-4eb7-account-create-update-77mdf"] Mar 10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.135135 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-nmqx6"] Mar 10 15:37:40 crc kubenswrapper[4795]: I0310 15:37:40.141840 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-7t4xc"] Mar 10 15:37:41 crc kubenswrapper[4795]: I0310 15:37:41.493157 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1de929ea-a6bc-48b9-9254-f0eaa6a73f36" path="/var/lib/kubelet/pods/1de929ea-a6bc-48b9-9254-f0eaa6a73f36/volumes" Mar 10 15:37:41 crc kubenswrapper[4795]: I0310 15:37:41.494581 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a4dce0f-e1c8-434c-b513-0f6f86d89099" path="/var/lib/kubelet/pods/2a4dce0f-e1c8-434c-b513-0f6f86d89099/volumes" Mar 10 15:37:41 crc kubenswrapper[4795]: I0310 15:37:41.495974 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5678c7de-b0fe-4d07-a752-1cd7eea46db6" path="/var/lib/kubelet/pods/5678c7de-b0fe-4d07-a752-1cd7eea46db6/volumes" Mar 10 15:37:41 crc kubenswrapper[4795]: I0310 15:37:41.497334 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82d29f02-6127-4a6a-8df2-b541ce3ee733" path="/var/lib/kubelet/pods/82d29f02-6127-4a6a-8df2-b541ce3ee733/volumes" Mar 10 15:37:41 crc kubenswrapper[4795]: I0310 15:37:41.500122 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8412ff64-8bdb-4af0-a844-997ad007635e" path="/var/lib/kubelet/pods/8412ff64-8bdb-4af0-a844-997ad007635e/volumes" Mar 10 15:37:41 crc kubenswrapper[4795]: I0310 15:37:41.501593 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3002df7-165d-4f82-9e02-4d48bf960c87" path="/var/lib/kubelet/pods/e3002df7-165d-4f82-9e02-4d48bf960c87/volumes" Mar 10 15:37:41 crc kubenswrapper[4795]: I0310 15:37:41.824217 4795 generic.go:334] "Generic (PLEG): container finished" podID="f7025f5a-4547-4d24-8e11-bb54a9ff4311" containerID="338b04783e1e9ea79a40ddf74059a46423ee03e6e173d6e23aa3ce591d4717ae" exitCode=0 Mar 10 15:37:41 crc kubenswrapper[4795]: I0310 15:37:41.824286 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" event={"ID":"f7025f5a-4547-4d24-8e11-bb54a9ff4311","Type":"ContainerDied","Data":"338b04783e1e9ea79a40ddf74059a46423ee03e6e173d6e23aa3ce591d4717ae"} Mar 10 15:37:43 crc 
kubenswrapper[4795]: I0310 15:37:43.333845 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.446467 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-inventory\") pod \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.446623 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrjg6\" (UniqueName: \"kubernetes.io/projected/f7025f5a-4547-4d24-8e11-bb54a9ff4311-kube-api-access-mrjg6\") pod \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.446656 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-ssh-key-openstack-edpm-ipam\") pod \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\" (UID: \"f7025f5a-4547-4d24-8e11-bb54a9ff4311\") " Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.455415 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7025f5a-4547-4d24-8e11-bb54a9ff4311-kube-api-access-mrjg6" (OuterVolumeSpecName: "kube-api-access-mrjg6") pod "f7025f5a-4547-4d24-8e11-bb54a9ff4311" (UID: "f7025f5a-4547-4d24-8e11-bb54a9ff4311"). InnerVolumeSpecName "kube-api-access-mrjg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.473823 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f7025f5a-4547-4d24-8e11-bb54a9ff4311" (UID: "f7025f5a-4547-4d24-8e11-bb54a9ff4311"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.480297 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-inventory" (OuterVolumeSpecName: "inventory") pod "f7025f5a-4547-4d24-8e11-bb54a9ff4311" (UID: "f7025f5a-4547-4d24-8e11-bb54a9ff4311"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.553468 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.553510 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrjg6\" (UniqueName: \"kubernetes.io/projected/f7025f5a-4547-4d24-8e11-bb54a9ff4311-kube-api-access-mrjg6\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.553530 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7025f5a-4547-4d24-8e11-bb54a9ff4311-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.847725 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" 
event={"ID":"f7025f5a-4547-4d24-8e11-bb54a9ff4311","Type":"ContainerDied","Data":"48032f17291bdedf84245f828255cf8798f2ff642e9273171e92a4643663c956"} Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.847764 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48032f17291bdedf84245f828255cf8798f2ff642e9273171e92a4643663c956" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.847782 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-snr99" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.931882 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt"] Mar 10 15:37:43 crc kubenswrapper[4795]: E0310 15:37:43.932576 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7025f5a-4547-4d24-8e11-bb54a9ff4311" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.932611 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7025f5a-4547-4d24-8e11-bb54a9ff4311" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.933025 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7025f5a-4547-4d24-8e11-bb54a9ff4311" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.934035 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.936477 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.937033 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.937378 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.937604 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.950255 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt"] Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.959056 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xljxm\" (UniqueName: \"kubernetes.io/projected/9b45b478-8da9-468e-94d0-c6eca284ef60-kube-api-access-xljxm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:37:43 crc kubenswrapper[4795]: I0310 15:37:43.959200 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:37:43 crc 
kubenswrapper[4795]: I0310 15:37:43.959284 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.060878 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.061322 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xljxm\" (UniqueName: \"kubernetes.io/projected/9b45b478-8da9-468e-94d0-c6eca284ef60-kube-api-access-xljxm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.061359 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.066052 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.075180 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.077706 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xljxm\" (UniqueName: \"kubernetes.io/projected/9b45b478-8da9-468e-94d0-c6eca284ef60-kube-api-access-xljxm\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.263027 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.321015 4795 scope.go:117] "RemoveContainer" containerID="c38a22ec55758969bb09dc77f11e0196b3242889a6387f045d0477d331c266d0" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.424786 4795 scope.go:117] "RemoveContainer" containerID="aa96d020ad3adf8993443396071d816ecf85b246fd51593f4c37ce1959b38d13" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.460052 4795 scope.go:117] "RemoveContainer" containerID="9ba9cb0a8abffd4cae87b0d21779ed81e17831e82002891cd86da8d3d80b74cb" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.532736 4795 scope.go:117] "RemoveContainer" containerID="46f07e238062e9e174acd6f8104a4bf7a444b7d87dd9aced24ab8acbdfaea4e0" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.579861 4795 scope.go:117] "RemoveContainer" containerID="7aabf5d7cb7726116bb126dfadad56556c53264e0490c57ab36ce5b890be4f26" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.621173 4795 scope.go:117] "RemoveContainer" containerID="442bea24fb4a81cd3787c64bf697eadca18160e3b23da0874fdb9689e66848b2" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.650612 4795 scope.go:117] "RemoveContainer" containerID="c466bef69580a6a3e3583f4e350912746cd8d3dccfadd079be646071b86b4f24" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.670306 4795 scope.go:117] "RemoveContainer" containerID="3a5b205df7ef38a505f95a2d8756c9605b99541bcd1f12386901c56b5ff70611" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.688684 4795 scope.go:117] "RemoveContainer" containerID="2d097c82909a3759221c437fade51ca1cee2763b48aca5349f14ab15230759bd" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.713687 4795 scope.go:117] "RemoveContainer" containerID="4d2bb03ea1839fd98374fa87dd1f961fae081be8e384b53ef71953122c88e056" Mar 10 15:37:44 crc kubenswrapper[4795]: I0310 15:37:44.850220 4795 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt"] Mar 10 15:37:45 crc kubenswrapper[4795]: I0310 15:37:45.885716 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" event={"ID":"9b45b478-8da9-468e-94d0-c6eca284ef60","Type":"ContainerStarted","Data":"d273f5ed3ce5f2afa9af9dd79794f1013802d8e331bca78e795673f9ca8fe00b"} Mar 10 15:37:45 crc kubenswrapper[4795]: I0310 15:37:45.886284 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" event={"ID":"9b45b478-8da9-468e-94d0-c6eca284ef60","Type":"ContainerStarted","Data":"5a0f236615aebee3825387d31b4e3ebae70220b7f5871fc1aee4b649fbd50ce7"} Mar 10 15:37:45 crc kubenswrapper[4795]: I0310 15:37:45.917208 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" podStartSLOduration=2.483225639 podStartE2EDuration="2.917183497s" podCreationTimestamp="2026-03-10 15:37:43 +0000 UTC" firstStartedPulling="2026-03-10 15:37:44.864915911 +0000 UTC m=+1898.030656809" lastFinishedPulling="2026-03-10 15:37:45.298873769 +0000 UTC m=+1898.464614667" observedRunningTime="2026-03-10 15:37:45.905444881 +0000 UTC m=+1899.071185779" watchObservedRunningTime="2026-03-10 15:37:45.917183497 +0000 UTC m=+1899.082924385" Mar 10 15:37:50 crc kubenswrapper[4795]: I0310 15:37:50.477249 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:37:50 crc kubenswrapper[4795]: E0310 15:37:50.479105 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:38:00 crc kubenswrapper[4795]: I0310 15:38:00.144031 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552618-2s9m9"] Mar 10 15:38:00 crc kubenswrapper[4795]: I0310 15:38:00.145673 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552618-2s9m9" Mar 10 15:38:00 crc kubenswrapper[4795]: I0310 15:38:00.148074 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:38:00 crc kubenswrapper[4795]: I0310 15:38:00.148606 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:38:00 crc kubenswrapper[4795]: I0310 15:38:00.148844 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:38:00 crc kubenswrapper[4795]: I0310 15:38:00.157592 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552618-2s9m9"] Mar 10 15:38:00 crc kubenswrapper[4795]: I0310 15:38:00.224727 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl6lj\" (UniqueName: \"kubernetes.io/projected/794fab49-0b94-4e9d-b291-ae2a4654f7fe-kube-api-access-kl6lj\") pod \"auto-csr-approver-29552618-2s9m9\" (UID: \"794fab49-0b94-4e9d-b291-ae2a4654f7fe\") " pod="openshift-infra/auto-csr-approver-29552618-2s9m9" Mar 10 15:38:00 crc kubenswrapper[4795]: I0310 15:38:00.326001 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl6lj\" (UniqueName: \"kubernetes.io/projected/794fab49-0b94-4e9d-b291-ae2a4654f7fe-kube-api-access-kl6lj\") pod \"auto-csr-approver-29552618-2s9m9\" (UID: \"794fab49-0b94-4e9d-b291-ae2a4654f7fe\") " 
pod="openshift-infra/auto-csr-approver-29552618-2s9m9" Mar 10 15:38:00 crc kubenswrapper[4795]: I0310 15:38:00.346312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl6lj\" (UniqueName: \"kubernetes.io/projected/794fab49-0b94-4e9d-b291-ae2a4654f7fe-kube-api-access-kl6lj\") pod \"auto-csr-approver-29552618-2s9m9\" (UID: \"794fab49-0b94-4e9d-b291-ae2a4654f7fe\") " pod="openshift-infra/auto-csr-approver-29552618-2s9m9" Mar 10 15:38:00 crc kubenswrapper[4795]: I0310 15:38:00.493545 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552618-2s9m9" Mar 10 15:38:00 crc kubenswrapper[4795]: I0310 15:38:00.966963 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552618-2s9m9"] Mar 10 15:38:00 crc kubenswrapper[4795]: W0310 15:38:00.967314 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod794fab49_0b94_4e9d_b291_ae2a4654f7fe.slice/crio-70fe6bef793c1198831c1c2b18fa22ba3f7ff0be45a65baf6ad3c23d0409cf02 WatchSource:0}: Error finding container 70fe6bef793c1198831c1c2b18fa22ba3f7ff0be45a65baf6ad3c23d0409cf02: Status 404 returned error can't find the container with id 70fe6bef793c1198831c1c2b18fa22ba3f7ff0be45a65baf6ad3c23d0409cf02 Mar 10 15:38:01 crc kubenswrapper[4795]: I0310 15:38:01.031392 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552618-2s9m9" event={"ID":"794fab49-0b94-4e9d-b291-ae2a4654f7fe","Type":"ContainerStarted","Data":"70fe6bef793c1198831c1c2b18fa22ba3f7ff0be45a65baf6ad3c23d0409cf02"} Mar 10 15:38:03 crc kubenswrapper[4795]: I0310 15:38:03.047480 4795 generic.go:334] "Generic (PLEG): container finished" podID="794fab49-0b94-4e9d-b291-ae2a4654f7fe" containerID="26140592fc8d7e2848b169053a3d627cb2a36f7dee16b177256cc7b0cb73736b" exitCode=0 Mar 10 15:38:03 crc kubenswrapper[4795]: 
I0310 15:38:03.047572 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552618-2s9m9" event={"ID":"794fab49-0b94-4e9d-b291-ae2a4654f7fe","Type":"ContainerDied","Data":"26140592fc8d7e2848b169053a3d627cb2a36f7dee16b177256cc7b0cb73736b"} Mar 10 15:38:04 crc kubenswrapper[4795]: I0310 15:38:04.380700 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552618-2s9m9" Mar 10 15:38:04 crc kubenswrapper[4795]: I0310 15:38:04.410180 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kl6lj\" (UniqueName: \"kubernetes.io/projected/794fab49-0b94-4e9d-b291-ae2a4654f7fe-kube-api-access-kl6lj\") pod \"794fab49-0b94-4e9d-b291-ae2a4654f7fe\" (UID: \"794fab49-0b94-4e9d-b291-ae2a4654f7fe\") " Mar 10 15:38:04 crc kubenswrapper[4795]: I0310 15:38:04.417669 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794fab49-0b94-4e9d-b291-ae2a4654f7fe-kube-api-access-kl6lj" (OuterVolumeSpecName: "kube-api-access-kl6lj") pod "794fab49-0b94-4e9d-b291-ae2a4654f7fe" (UID: "794fab49-0b94-4e9d-b291-ae2a4654f7fe"). InnerVolumeSpecName "kube-api-access-kl6lj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:38:04 crc kubenswrapper[4795]: I0310 15:38:04.512981 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kl6lj\" (UniqueName: \"kubernetes.io/projected/794fab49-0b94-4e9d-b291-ae2a4654f7fe-kube-api-access-kl6lj\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:05 crc kubenswrapper[4795]: I0310 15:38:05.069616 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552618-2s9m9" event={"ID":"794fab49-0b94-4e9d-b291-ae2a4654f7fe","Type":"ContainerDied","Data":"70fe6bef793c1198831c1c2b18fa22ba3f7ff0be45a65baf6ad3c23d0409cf02"} Mar 10 15:38:05 crc kubenswrapper[4795]: I0310 15:38:05.069671 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70fe6bef793c1198831c1c2b18fa22ba3f7ff0be45a65baf6ad3c23d0409cf02" Mar 10 15:38:05 crc kubenswrapper[4795]: I0310 15:38:05.069687 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552618-2s9m9" Mar 10 15:38:05 crc kubenswrapper[4795]: I0310 15:38:05.458232 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552612-xxx2v"] Mar 10 15:38:05 crc kubenswrapper[4795]: I0310 15:38:05.466607 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552612-xxx2v"] Mar 10 15:38:05 crc kubenswrapper[4795]: I0310 15:38:05.477697 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:38:05 crc kubenswrapper[4795]: E0310 15:38:05.478194 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:38:05 crc kubenswrapper[4795]: I0310 15:38:05.487479 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a771fc14-3d9d-440c-a4e6-7e02569919c1" path="/var/lib/kubelet/pods/a771fc14-3d9d-440c-a4e6-7e02569919c1/volumes" Mar 10 15:38:08 crc kubenswrapper[4795]: I0310 15:38:08.027603 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ldnh5"] Mar 10 15:38:08 crc kubenswrapper[4795]: I0310 15:38:08.038747 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-ldnh5"] Mar 10 15:38:09 crc kubenswrapper[4795]: I0310 15:38:09.486876 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e6445e9-4e77-48dd-8550-e4068a6d9db2" path="/var/lib/kubelet/pods/7e6445e9-4e77-48dd-8550-e4068a6d9db2/volumes" Mar 10 15:38:18 crc kubenswrapper[4795]: I0310 15:38:18.476156 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:38:18 crc kubenswrapper[4795]: E0310 15:38:18.477152 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:38:26 crc kubenswrapper[4795]: I0310 15:38:26.051637 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-rfcwp"] Mar 10 15:38:26 crc kubenswrapper[4795]: I0310 15:38:26.058616 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-rfcwp"] Mar 10 15:38:27 crc kubenswrapper[4795]: 
I0310 15:38:27.039017 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkhtd"] Mar 10 15:38:27 crc kubenswrapper[4795]: I0310 15:38:27.056580 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-tkhtd"] Mar 10 15:38:27 crc kubenswrapper[4795]: I0310 15:38:27.515560 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="264227be-e2df-4b25-bfff-14226b9f6703" path="/var/lib/kubelet/pods/264227be-e2df-4b25-bfff-14226b9f6703/volumes" Mar 10 15:38:27 crc kubenswrapper[4795]: I0310 15:38:27.524386 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a0ed63-687e-4e41-8c99-e299cb991e17" path="/var/lib/kubelet/pods/d1a0ed63-687e-4e41-8c99-e299cb991e17/volumes" Mar 10 15:38:28 crc kubenswrapper[4795]: I0310 15:38:28.284890 4795 generic.go:334] "Generic (PLEG): container finished" podID="9b45b478-8da9-468e-94d0-c6eca284ef60" containerID="d273f5ed3ce5f2afa9af9dd79794f1013802d8e331bca78e795673f9ca8fe00b" exitCode=0 Mar 10 15:38:28 crc kubenswrapper[4795]: I0310 15:38:28.285881 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" event={"ID":"9b45b478-8da9-468e-94d0-c6eca284ef60","Type":"ContainerDied","Data":"d273f5ed3ce5f2afa9af9dd79794f1013802d8e331bca78e795673f9ca8fe00b"} Mar 10 15:38:29 crc kubenswrapper[4795]: I0310 15:38:29.717774 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:38:29 crc kubenswrapper[4795]: I0310 15:38:29.888869 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-inventory\") pod \"9b45b478-8da9-468e-94d0-c6eca284ef60\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " Mar 10 15:38:29 crc kubenswrapper[4795]: I0310 15:38:29.889021 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-ssh-key-openstack-edpm-ipam\") pod \"9b45b478-8da9-468e-94d0-c6eca284ef60\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " Mar 10 15:38:29 crc kubenswrapper[4795]: I0310 15:38:29.889389 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xljxm\" (UniqueName: \"kubernetes.io/projected/9b45b478-8da9-468e-94d0-c6eca284ef60-kube-api-access-xljxm\") pod \"9b45b478-8da9-468e-94d0-c6eca284ef60\" (UID: \"9b45b478-8da9-468e-94d0-c6eca284ef60\") " Mar 10 15:38:29 crc kubenswrapper[4795]: I0310 15:38:29.896632 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b45b478-8da9-468e-94d0-c6eca284ef60-kube-api-access-xljxm" (OuterVolumeSpecName: "kube-api-access-xljxm") pod "9b45b478-8da9-468e-94d0-c6eca284ef60" (UID: "9b45b478-8da9-468e-94d0-c6eca284ef60"). InnerVolumeSpecName "kube-api-access-xljxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:38:29 crc kubenswrapper[4795]: I0310 15:38:29.920347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9b45b478-8da9-468e-94d0-c6eca284ef60" (UID: "9b45b478-8da9-468e-94d0-c6eca284ef60"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:38:29 crc kubenswrapper[4795]: I0310 15:38:29.922401 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-inventory" (OuterVolumeSpecName: "inventory") pod "9b45b478-8da9-468e-94d0-c6eca284ef60" (UID: "9b45b478-8da9-468e-94d0-c6eca284ef60"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:38:29 crc kubenswrapper[4795]: I0310 15:38:29.991429 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:29 crc kubenswrapper[4795]: I0310 15:38:29.991464 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9b45b478-8da9-468e-94d0-c6eca284ef60-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:29 crc kubenswrapper[4795]: I0310 15:38:29.991474 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xljxm\" (UniqueName: \"kubernetes.io/projected/9b45b478-8da9-468e-94d0-c6eca284ef60-kube-api-access-xljxm\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.315099 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" 
event={"ID":"9b45b478-8da9-468e-94d0-c6eca284ef60","Type":"ContainerDied","Data":"5a0f236615aebee3825387d31b4e3ebae70220b7f5871fc1aee4b649fbd50ce7"} Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.315162 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a0f236615aebee3825387d31b4e3ebae70220b7f5871fc1aee4b649fbd50ce7" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.315192 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.409036 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-k2fn2"] Mar 10 15:38:30 crc kubenswrapper[4795]: E0310 15:38:30.409513 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b45b478-8da9-468e-94d0-c6eca284ef60" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.409538 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b45b478-8da9-468e-94d0-c6eca284ef60" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:38:30 crc kubenswrapper[4795]: E0310 15:38:30.409560 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794fab49-0b94-4e9d-b291-ae2a4654f7fe" containerName="oc" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.409570 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="794fab49-0b94-4e9d-b291-ae2a4654f7fe" containerName="oc" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.409787 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="794fab49-0b94-4e9d-b291-ae2a4654f7fe" containerName="oc" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.409814 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b45b478-8da9-468e-94d0-c6eca284ef60" 
containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.410648 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.417815 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.418495 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.418628 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.418649 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.421579 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-k2fn2"] Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.602735 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdrfk\" (UniqueName: \"kubernetes.io/projected/def1f0a0-23d4-496c-b75f-037e2666d444-kube-api-access-xdrfk\") pod \"ssh-known-hosts-edpm-deployment-k2fn2\" (UID: \"def1f0a0-23d4-496c-b75f-037e2666d444\") " pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.603037 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-k2fn2\" (UID: \"def1f0a0-23d4-496c-b75f-037e2666d444\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.603450 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-k2fn2\" (UID: \"def1f0a0-23d4-496c-b75f-037e2666d444\") " pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.705181 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-k2fn2\" (UID: \"def1f0a0-23d4-496c-b75f-037e2666d444\") " pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.705248 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdrfk\" (UniqueName: \"kubernetes.io/projected/def1f0a0-23d4-496c-b75f-037e2666d444-kube-api-access-xdrfk\") pod \"ssh-known-hosts-edpm-deployment-k2fn2\" (UID: \"def1f0a0-23d4-496c-b75f-037e2666d444\") " pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.705291 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-k2fn2\" (UID: \"def1f0a0-23d4-496c-b75f-037e2666d444\") " pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.710334 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-inventory-0\") pod 
\"ssh-known-hosts-edpm-deployment-k2fn2\" (UID: \"def1f0a0-23d4-496c-b75f-037e2666d444\") " pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.711228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-k2fn2\" (UID: \"def1f0a0-23d4-496c-b75f-037e2666d444\") " pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.724279 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdrfk\" (UniqueName: \"kubernetes.io/projected/def1f0a0-23d4-496c-b75f-037e2666d444-kube-api-access-xdrfk\") pod \"ssh-known-hosts-edpm-deployment-k2fn2\" (UID: \"def1f0a0-23d4-496c-b75f-037e2666d444\") " pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:30 crc kubenswrapper[4795]: I0310 15:38:30.736226 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:31 crc kubenswrapper[4795]: I0310 15:38:31.258492 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-k2fn2"] Mar 10 15:38:31 crc kubenswrapper[4795]: I0310 15:38:31.324444 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" event={"ID":"def1f0a0-23d4-496c-b75f-037e2666d444","Type":"ContainerStarted","Data":"61edfe00f23c16bf0c8d0a720d0764e3c86e8a84e1e5f493bcb5ed30eb5768e3"} Mar 10 15:38:32 crc kubenswrapper[4795]: I0310 15:38:32.339928 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" event={"ID":"def1f0a0-23d4-496c-b75f-037e2666d444","Type":"ContainerStarted","Data":"cd4db3d84db99cf07937b36940e2e9adf2784dc5c9e4284778f301a9b7b555f0"} Mar 10 15:38:32 crc kubenswrapper[4795]: I0310 15:38:32.372097 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" podStartSLOduration=1.946887363 podStartE2EDuration="2.372059995s" podCreationTimestamp="2026-03-10 15:38:30 +0000 UTC" firstStartedPulling="2026-03-10 15:38:31.269927663 +0000 UTC m=+1944.435668561" lastFinishedPulling="2026-03-10 15:38:31.695100295 +0000 UTC m=+1944.860841193" observedRunningTime="2026-03-10 15:38:32.362962646 +0000 UTC m=+1945.528703544" watchObservedRunningTime="2026-03-10 15:38:32.372059995 +0000 UTC m=+1945.537800893" Mar 10 15:38:33 crc kubenswrapper[4795]: I0310 15:38:33.476678 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64" Mar 10 15:38:34 crc kubenswrapper[4795]: I0310 15:38:34.364792 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" 
event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"0ef273750e0cbde25fb287f8a8956b379d7b9f04565ff3bcef74366fd9990cc5"} Mar 10 15:38:38 crc kubenswrapper[4795]: I0310 15:38:38.405682 4795 generic.go:334] "Generic (PLEG): container finished" podID="def1f0a0-23d4-496c-b75f-037e2666d444" containerID="cd4db3d84db99cf07937b36940e2e9adf2784dc5c9e4284778f301a9b7b555f0" exitCode=0 Mar 10 15:38:38 crc kubenswrapper[4795]: I0310 15:38:38.405772 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" event={"ID":"def1f0a0-23d4-496c-b75f-037e2666d444","Type":"ContainerDied","Data":"cd4db3d84db99cf07937b36940e2e9adf2784dc5c9e4284778f301a9b7b555f0"} Mar 10 15:38:39 crc kubenswrapper[4795]: I0310 15:38:39.853042 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.001131 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdrfk\" (UniqueName: \"kubernetes.io/projected/def1f0a0-23d4-496c-b75f-037e2666d444-kube-api-access-xdrfk\") pod \"def1f0a0-23d4-496c-b75f-037e2666d444\" (UID: \"def1f0a0-23d4-496c-b75f-037e2666d444\") " Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.001488 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-inventory-0\") pod \"def1f0a0-23d4-496c-b75f-037e2666d444\" (UID: \"def1f0a0-23d4-496c-b75f-037e2666d444\") " Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.001522 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-ssh-key-openstack-edpm-ipam\") pod \"def1f0a0-23d4-496c-b75f-037e2666d444\" (UID: 
\"def1f0a0-23d4-496c-b75f-037e2666d444\") " Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.008478 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def1f0a0-23d4-496c-b75f-037e2666d444-kube-api-access-xdrfk" (OuterVolumeSpecName: "kube-api-access-xdrfk") pod "def1f0a0-23d4-496c-b75f-037e2666d444" (UID: "def1f0a0-23d4-496c-b75f-037e2666d444"). InnerVolumeSpecName "kube-api-access-xdrfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.049321 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "def1f0a0-23d4-496c-b75f-037e2666d444" (UID: "def1f0a0-23d4-496c-b75f-037e2666d444"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.061180 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "def1f0a0-23d4-496c-b75f-037e2666d444" (UID: "def1f0a0-23d4-496c-b75f-037e2666d444"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.104530 4795 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.104575 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/def1f0a0-23d4-496c-b75f-037e2666d444-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.104591 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdrfk\" (UniqueName: \"kubernetes.io/projected/def1f0a0-23d4-496c-b75f-037e2666d444-kube-api-access-xdrfk\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.428877 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" event={"ID":"def1f0a0-23d4-496c-b75f-037e2666d444","Type":"ContainerDied","Data":"61edfe00f23c16bf0c8d0a720d0764e3c86e8a84e1e5f493bcb5ed30eb5768e3"} Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.428912 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61edfe00f23c16bf0c8d0a720d0764e3c86e8a84e1e5f493bcb5ed30eb5768e3" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.428938 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-k2fn2" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.539506 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6"] Mar 10 15:38:40 crc kubenswrapper[4795]: E0310 15:38:40.539978 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def1f0a0-23d4-496c-b75f-037e2666d444" containerName="ssh-known-hosts-edpm-deployment" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.540005 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="def1f0a0-23d4-496c-b75f-037e2666d444" containerName="ssh-known-hosts-edpm-deployment" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.540322 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="def1f0a0-23d4-496c-b75f-037e2666d444" containerName="ssh-known-hosts-edpm-deployment" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.541606 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.545557 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.545655 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.545823 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.546018 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.550932 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6"] Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.717216 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-m6ln6\" (UID: \"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.717670 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctlgh\" (UniqueName: \"kubernetes.io/projected/672817be-38e6-4a34-9ef5-5c73fed66fdb-kube-api-access-ctlgh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-m6ln6\" (UID: \"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.717882 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-m6ln6\" (UID: \"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.819457 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-m6ln6\" (UID: \"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.819510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctlgh\" (UniqueName: \"kubernetes.io/projected/672817be-38e6-4a34-9ef5-5c73fed66fdb-kube-api-access-ctlgh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-m6ln6\" (UID: \"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.819596 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-m6ln6\" (UID: \"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.825232 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-m6ln6\" (UID: 
\"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.825974 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-m6ln6\" (UID: \"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.842251 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctlgh\" (UniqueName: \"kubernetes.io/projected/672817be-38e6-4a34-9ef5-5c73fed66fdb-kube-api-access-ctlgh\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-m6ln6\" (UID: \"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:40 crc kubenswrapper[4795]: I0310 15:38:40.862403 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:41 crc kubenswrapper[4795]: I0310 15:38:41.458493 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6"] Mar 10 15:38:42 crc kubenswrapper[4795]: I0310 15:38:42.454658 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" event={"ID":"672817be-38e6-4a34-9ef5-5c73fed66fdb","Type":"ContainerStarted","Data":"b5bf2962c4ff8d3786a67c498a89329b1723c3643b4a0143e84ff107fe1f0d67"} Mar 10 15:38:42 crc kubenswrapper[4795]: I0310 15:38:42.454958 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" event={"ID":"672817be-38e6-4a34-9ef5-5c73fed66fdb","Type":"ContainerStarted","Data":"500c0057224ce25f065a129ee923169ddec0574c20c2606550e7a8bbeaf5c416"} Mar 10 15:38:42 crc kubenswrapper[4795]: I0310 15:38:42.490525 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" podStartSLOduration=1.992288416 podStartE2EDuration="2.490503651s" podCreationTimestamp="2026-03-10 15:38:40 +0000 UTC" firstStartedPulling="2026-03-10 15:38:41.465292271 +0000 UTC m=+1954.631033179" lastFinishedPulling="2026-03-10 15:38:41.963507516 +0000 UTC m=+1955.129248414" observedRunningTime="2026-03-10 15:38:42.477174581 +0000 UTC m=+1955.642915489" watchObservedRunningTime="2026-03-10 15:38:42.490503651 +0000 UTC m=+1955.656244549" Mar 10 15:38:44 crc kubenswrapper[4795]: I0310 15:38:44.911188 4795 scope.go:117] "RemoveContainer" containerID="eb88599d2817a644775a8873925b1e9ee2221e005905f5e309974e191e26b13c" Mar 10 15:38:44 crc kubenswrapper[4795]: I0310 15:38:44.976209 4795 scope.go:117] "RemoveContainer" containerID="313c53deb402c3cc0fa68eb9bd3743342a94414c83f34f32ac38395591172e75" Mar 10 15:38:45 crc kubenswrapper[4795]: I0310 
15:38:45.017993 4795 scope.go:117] "RemoveContainer" containerID="83cd94d127b49631eeacb75b13436fbf19766ce9c3251ca5bd728af1e189b433" Mar 10 15:38:45 crc kubenswrapper[4795]: I0310 15:38:45.062283 4795 scope.go:117] "RemoveContainer" containerID="f6d987a694f05b8327d82ec0378cddffa2915c07f778d57c9fd7c571688f2765" Mar 10 15:38:49 crc kubenswrapper[4795]: I0310 15:38:49.526515 4795 generic.go:334] "Generic (PLEG): container finished" podID="672817be-38e6-4a34-9ef5-5c73fed66fdb" containerID="b5bf2962c4ff8d3786a67c498a89329b1723c3643b4a0143e84ff107fe1f0d67" exitCode=0 Mar 10 15:38:49 crc kubenswrapper[4795]: I0310 15:38:49.526615 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" event={"ID":"672817be-38e6-4a34-9ef5-5c73fed66fdb","Type":"ContainerDied","Data":"b5bf2962c4ff8d3786a67c498a89329b1723c3643b4a0143e84ff107fe1f0d67"} Mar 10 15:38:50 crc kubenswrapper[4795]: I0310 15:38:50.959675 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.128114 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctlgh\" (UniqueName: \"kubernetes.io/projected/672817be-38e6-4a34-9ef5-5c73fed66fdb-kube-api-access-ctlgh\") pod \"672817be-38e6-4a34-9ef5-5c73fed66fdb\" (UID: \"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.128174 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-inventory\") pod \"672817be-38e6-4a34-9ef5-5c73fed66fdb\" (UID: \"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.128320 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-ssh-key-openstack-edpm-ipam\") pod \"672817be-38e6-4a34-9ef5-5c73fed66fdb\" (UID: \"672817be-38e6-4a34-9ef5-5c73fed66fdb\") " Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.133439 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/672817be-38e6-4a34-9ef5-5c73fed66fdb-kube-api-access-ctlgh" (OuterVolumeSpecName: "kube-api-access-ctlgh") pod "672817be-38e6-4a34-9ef5-5c73fed66fdb" (UID: "672817be-38e6-4a34-9ef5-5c73fed66fdb"). InnerVolumeSpecName "kube-api-access-ctlgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.158484 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-inventory" (OuterVolumeSpecName: "inventory") pod "672817be-38e6-4a34-9ef5-5c73fed66fdb" (UID: "672817be-38e6-4a34-9ef5-5c73fed66fdb"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.159048 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "672817be-38e6-4a34-9ef5-5c73fed66fdb" (UID: "672817be-38e6-4a34-9ef5-5c73fed66fdb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.230848 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctlgh\" (UniqueName: \"kubernetes.io/projected/672817be-38e6-4a34-9ef5-5c73fed66fdb-kube-api-access-ctlgh\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.230881 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.230893 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/672817be-38e6-4a34-9ef5-5c73fed66fdb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.545387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" event={"ID":"672817be-38e6-4a34-9ef5-5c73fed66fdb","Type":"ContainerDied","Data":"500c0057224ce25f065a129ee923169ddec0574c20c2606550e7a8bbeaf5c416"} Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.545433 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="500c0057224ce25f065a129ee923169ddec0574c20c2606550e7a8bbeaf5c416" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 
15:38:51.545437 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-m6ln6" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.616530 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx"] Mar 10 15:38:51 crc kubenswrapper[4795]: E0310 15:38:51.616996 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="672817be-38e6-4a34-9ef5-5c73fed66fdb" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.617020 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="672817be-38e6-4a34-9ef5-5c73fed66fdb" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.617292 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="672817be-38e6-4a34-9ef5-5c73fed66fdb" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.618086 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.620159 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.620536 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.620669 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.623250 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.625559 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx"] Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.638139 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txgs8\" (UniqueName: \"kubernetes.io/projected/6fc691de-4716-4cb9-9e09-f085b6bc7625-kube-api-access-txgs8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.638467 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.638615 4795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.739476 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txgs8\" (UniqueName: \"kubernetes.io/projected/6fc691de-4716-4cb9-9e09-f085b6bc7625-kube-api-access-txgs8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.739894 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.740020 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.746508 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.748992 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.759446 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txgs8\" (UniqueName: \"kubernetes.io/projected/6fc691de-4716-4cb9-9e09-f085b6bc7625-kube-api-access-txgs8\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:38:51 crc kubenswrapper[4795]: I0310 15:38:51.933277 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:38:52 crc kubenswrapper[4795]: W0310 15:38:52.461653 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6fc691de_4716_4cb9_9e09_f085b6bc7625.slice/crio-ed734221203e6c6576cc3b6dc53c6b25ea2aee988f90c4213b7215c8ae892a0b WatchSource:0}: Error finding container ed734221203e6c6576cc3b6dc53c6b25ea2aee988f90c4213b7215c8ae892a0b: Status 404 returned error can't find the container with id ed734221203e6c6576cc3b6dc53c6b25ea2aee988f90c4213b7215c8ae892a0b Mar 10 15:38:52 crc kubenswrapper[4795]: I0310 15:38:52.462293 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx"] Mar 10 15:38:52 crc kubenswrapper[4795]: I0310 15:38:52.553387 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" event={"ID":"6fc691de-4716-4cb9-9e09-f085b6bc7625","Type":"ContainerStarted","Data":"ed734221203e6c6576cc3b6dc53c6b25ea2aee988f90c4213b7215c8ae892a0b"} Mar 10 15:38:53 crc kubenswrapper[4795]: I0310 15:38:53.565179 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" event={"ID":"6fc691de-4716-4cb9-9e09-f085b6bc7625","Type":"ContainerStarted","Data":"905584bae1e20f6793c8713463a83f6169a4cb0689c975cc93dffa855b17cb8a"} Mar 10 15:39:02 crc kubenswrapper[4795]: I0310 15:39:02.640650 4795 generic.go:334] "Generic (PLEG): container finished" podID="6fc691de-4716-4cb9-9e09-f085b6bc7625" containerID="905584bae1e20f6793c8713463a83f6169a4cb0689c975cc93dffa855b17cb8a" exitCode=0 Mar 10 15:39:02 crc kubenswrapper[4795]: I0310 15:39:02.640832 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" 
event={"ID":"6fc691de-4716-4cb9-9e09-f085b6bc7625","Type":"ContainerDied","Data":"905584bae1e20f6793c8713463a83f6169a4cb0689c975cc93dffa855b17cb8a"} Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.056670 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.096478 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-inventory\") pod \"6fc691de-4716-4cb9-9e09-f085b6bc7625\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.096579 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txgs8\" (UniqueName: \"kubernetes.io/projected/6fc691de-4716-4cb9-9e09-f085b6bc7625-kube-api-access-txgs8\") pod \"6fc691de-4716-4cb9-9e09-f085b6bc7625\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.096657 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-ssh-key-openstack-edpm-ipam\") pod \"6fc691de-4716-4cb9-9e09-f085b6bc7625\" (UID: \"6fc691de-4716-4cb9-9e09-f085b6bc7625\") " Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.126667 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fc691de-4716-4cb9-9e09-f085b6bc7625-kube-api-access-txgs8" (OuterVolumeSpecName: "kube-api-access-txgs8") pod "6fc691de-4716-4cb9-9e09-f085b6bc7625" (UID: "6fc691de-4716-4cb9-9e09-f085b6bc7625"). InnerVolumeSpecName "kube-api-access-txgs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.133028 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6fc691de-4716-4cb9-9e09-f085b6bc7625" (UID: "6fc691de-4716-4cb9-9e09-f085b6bc7625"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.139711 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-inventory" (OuterVolumeSpecName: "inventory") pod "6fc691de-4716-4cb9-9e09-f085b6bc7625" (UID: "6fc691de-4716-4cb9-9e09-f085b6bc7625"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.199407 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.199702 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txgs8\" (UniqueName: \"kubernetes.io/projected/6fc691de-4716-4cb9-9e09-f085b6bc7625-kube-api-access-txgs8\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.199713 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fc691de-4716-4cb9-9e09-f085b6bc7625-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.659610 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" 
event={"ID":"6fc691de-4716-4cb9-9e09-f085b6bc7625","Type":"ContainerDied","Data":"ed734221203e6c6576cc3b6dc53c6b25ea2aee988f90c4213b7215c8ae892a0b"} Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.659661 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed734221203e6c6576cc3b6dc53c6b25ea2aee988f90c4213b7215c8ae892a0b" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.659687 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.756713 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl"] Mar 10 15:39:04 crc kubenswrapper[4795]: E0310 15:39:04.757140 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fc691de-4716-4cb9-9e09-f085b6bc7625" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.757163 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fc691de-4716-4cb9-9e09-f085b6bc7625" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.757400 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fc691de-4716-4cb9-9e09-f085b6bc7625" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.758223 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.763748 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.764684 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.764989 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.765044 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.764988 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.765852 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.766129 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.766330 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl"] Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.766722 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809280 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809328 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809352 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809388 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809406 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btm2f\" (UniqueName: 
\"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-kube-api-access-btm2f\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809454 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809473 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809501 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809525 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809602 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809621 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809665 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.809714 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911299 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911347 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btm2f\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-kube-api-access-btm2f\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911375 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911415 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911474 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911496 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911531 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911554 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911600 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911647 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911719 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911767 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.911793 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.915996 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: 
\"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.916216 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.916223 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.916910 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.917011 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.917575 
4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.917929 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.918008 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.918239 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.918543 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.918562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.918548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.928198 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:04 crc kubenswrapper[4795]: I0310 15:39:04.935720 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btm2f\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-kube-api-access-btm2f\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-stfzl\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:05 crc kubenswrapper[4795]: I0310 15:39:05.075746 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:05 crc kubenswrapper[4795]: I0310 15:39:05.438466 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl"] Mar 10 15:39:05 crc kubenswrapper[4795]: I0310 15:39:05.670736 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" event={"ID":"e6878552-5eb7-470b-a482-f1be0b632858","Type":"ContainerStarted","Data":"5a5d6f034686b586e8dceb300634ce8ff04080337c57d80db42cc19c06ecd1df"} Mar 10 15:39:06 crc kubenswrapper[4795]: I0310 15:39:06.681718 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" event={"ID":"e6878552-5eb7-470b-a482-f1be0b632858","Type":"ContainerStarted","Data":"59b2e21aab4b99058c26959752f4b0e15be998322878922f833007081365a8c7"} Mar 10 15:39:06 crc kubenswrapper[4795]: I0310 15:39:06.706614 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" podStartSLOduration=2.273540385 podStartE2EDuration="2.706595612s" podCreationTimestamp="2026-03-10 15:39:04 +0000 UTC" firstStartedPulling="2026-03-10 15:39:05.460541395 +0000 UTC m=+1978.626282293" lastFinishedPulling="2026-03-10 15:39:05.893596632 +0000 UTC m=+1979.059337520" observedRunningTime="2026-03-10 15:39:06.697806781 +0000 UTC m=+1979.863547699" watchObservedRunningTime="2026-03-10 15:39:06.706595612 +0000 UTC m=+1979.872336510" Mar 10 15:39:11 crc kubenswrapper[4795]: I0310 15:39:11.035651 4795 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-nnjgg"] Mar 10 15:39:11 crc kubenswrapper[4795]: I0310 15:39:11.044113 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-nnjgg"] Mar 10 15:39:11 crc kubenswrapper[4795]: I0310 15:39:11.505042 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e" path="/var/lib/kubelet/pods/01a8e8b6-aafb-43ec-b0b1-2f9e11058e2e/volumes" Mar 10 15:39:40 crc kubenswrapper[4795]: I0310 15:39:40.984702 4795 generic.go:334] "Generic (PLEG): container finished" podID="e6878552-5eb7-470b-a482-f1be0b632858" containerID="59b2e21aab4b99058c26959752f4b0e15be998322878922f833007081365a8c7" exitCode=0 Mar 10 15:39:40 crc kubenswrapper[4795]: I0310 15:39:40.984897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" event={"ID":"e6878552-5eb7-470b-a482-f1be0b632858","Type":"ContainerDied","Data":"59b2e21aab4b99058c26959752f4b0e15be998322878922f833007081365a8c7"} Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.400078 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.578366 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-inventory\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.578837 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.578900 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-libvirt-combined-ca-bundle\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.579249 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btm2f\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-kube-api-access-btm2f\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.579279 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") 
" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.579726 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ssh-key-openstack-edpm-ipam\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.579753 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-bootstrap-combined-ca-bundle\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.579775 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-neutron-metadata-combined-ca-bundle\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.579801 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-repo-setup-combined-ca-bundle\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.579844 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-telemetry-combined-ca-bundle\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.579879 4795 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.579938 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.579985 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-nova-combined-ca-bundle\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.580010 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ovn-combined-ca-bundle\") pod \"e6878552-5eb7-470b-a482-f1be0b632858\" (UID: \"e6878552-5eb7-470b-a482-f1be0b632858\") " Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.591095 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "libvirt-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.591112 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-kube-api-access-btm2f" (OuterVolumeSpecName: "kube-api-access-btm2f") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "kube-api-access-btm2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.591272 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.591336 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.591348 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.591371 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.597223 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.600227 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.605652 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.606720 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.616759 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.618565 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.637535 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.638724 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-inventory" (OuterVolumeSpecName: "inventory") pod "e6878552-5eb7-470b-a482-f1be0b632858" (UID: "e6878552-5eb7-470b-a482-f1be0b632858"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682371 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682411 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682424 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682436 4795 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682446 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682456 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682464 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682472 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682483 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682491 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btm2f\" (UniqueName: \"kubernetes.io/projected/e6878552-5eb7-470b-a482-f1be0b632858-kube-api-access-btm2f\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682499 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682507 4795 reconciler_common.go:293] "Volume detached for 
volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682515 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:42 crc kubenswrapper[4795]: I0310 15:39:42.682525 4795 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6878552-5eb7-470b-a482-f1be0b632858-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.005310 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" event={"ID":"e6878552-5eb7-470b-a482-f1be0b632858","Type":"ContainerDied","Data":"5a5d6f034686b586e8dceb300634ce8ff04080337c57d80db42cc19c06ecd1df"} Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.005351 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a5d6f034686b586e8dceb300634ce8ff04080337c57d80db42cc19c06ecd1df" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.005361 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-stfzl" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.095698 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6"] Mar 10 15:39:43 crc kubenswrapper[4795]: E0310 15:39:43.096366 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6878552-5eb7-470b-a482-f1be0b632858" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.096383 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6878552-5eb7-470b-a482-f1be0b632858" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.096577 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6878552-5eb7-470b-a482-f1be0b632858" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.097285 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.100221 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.100274 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.100373 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.100382 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.100335 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.110401 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6"] Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.190140 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.190235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t5ts\" (UniqueName: \"kubernetes.io/projected/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-kube-api-access-4t5ts\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: 
\"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.190277 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.190295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.190361 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.291919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.292039 4795 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.292126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t5ts\" (UniqueName: \"kubernetes.io/projected/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-kube-api-access-4t5ts\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.292181 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.292206 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.293460 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.295805 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.296298 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.296962 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.313562 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t5ts\" (UniqueName: \"kubernetes.io/projected/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-kube-api-access-4t5ts\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-pf7c6\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.427872 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:39:43 crc kubenswrapper[4795]: I0310 15:39:43.983271 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6"] Mar 10 15:39:44 crc kubenswrapper[4795]: I0310 15:39:44.029202 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" event={"ID":"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66","Type":"ContainerStarted","Data":"36187ebf76cc816c00f15cb5bc04ce16d86483ce830e4dfbc8f6231a6388efd6"} Mar 10 15:39:45 crc kubenswrapper[4795]: I0310 15:39:45.038628 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" event={"ID":"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66","Type":"ContainerStarted","Data":"f86a397cbd7fe769c416477482de1da8f073fe20cff8d95636c652a5b567cc4b"} Mar 10 15:39:45 crc kubenswrapper[4795]: I0310 15:39:45.059162 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" podStartSLOduration=1.301793282 podStartE2EDuration="2.059140815s" podCreationTimestamp="2026-03-10 15:39:43 +0000 UTC" firstStartedPulling="2026-03-10 15:39:43.994020617 +0000 UTC m=+2017.159761525" lastFinishedPulling="2026-03-10 15:39:44.75136817 +0000 UTC m=+2017.917109058" observedRunningTime="2026-03-10 15:39:45.05439667 +0000 UTC m=+2018.220137578" watchObservedRunningTime="2026-03-10 15:39:45.059140815 +0000 UTC m=+2018.224881713" Mar 10 15:39:45 crc kubenswrapper[4795]: I0310 15:39:45.193819 4795 scope.go:117] "RemoveContainer" containerID="9b6994b2c070241525e196af6e56ba0507f837ee4bf086b63a3c23e47fc38fa3" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.003901 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h9gm5"] Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.006232 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.021941 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9gm5"] Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.096003 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx62j\" (UniqueName: \"kubernetes.io/projected/8de14967-42da-49ed-923c-b9f83f6e3550-kube-api-access-vx62j\") pod \"certified-operators-h9gm5\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.096235 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-catalog-content\") pod \"certified-operators-h9gm5\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.096371 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-utilities\") pod \"certified-operators-h9gm5\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.198009 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-utilities\") pod \"certified-operators-h9gm5\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.198125 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx62j\" (UniqueName: \"kubernetes.io/projected/8de14967-42da-49ed-923c-b9f83f6e3550-kube-api-access-vx62j\") pod \"certified-operators-h9gm5\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.198217 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-catalog-content\") pod \"certified-operators-h9gm5\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.198640 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-utilities\") pod \"certified-operators-h9gm5\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.198660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-catalog-content\") pod \"certified-operators-h9gm5\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.217481 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx62j\" (UniqueName: \"kubernetes.io/projected/8de14967-42da-49ed-923c-b9f83f6e3550-kube-api-access-vx62j\") pod \"certified-operators-h9gm5\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.328937 4795 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:39:52 crc kubenswrapper[4795]: I0310 15:39:52.876421 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h9gm5"] Mar 10 15:39:53 crc kubenswrapper[4795]: I0310 15:39:53.110494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gm5" event={"ID":"8de14967-42da-49ed-923c-b9f83f6e3550","Type":"ContainerStarted","Data":"8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b"} Mar 10 15:39:53 crc kubenswrapper[4795]: I0310 15:39:53.110556 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gm5" event={"ID":"8de14967-42da-49ed-923c-b9f83f6e3550","Type":"ContainerStarted","Data":"8492ac493a973091a78d48e9db1194b4c60447d8fc211b1d540e6fd7de6b06f7"} Mar 10 15:39:54 crc kubenswrapper[4795]: I0310 15:39:54.123651 4795 generic.go:334] "Generic (PLEG): container finished" podID="8de14967-42da-49ed-923c-b9f83f6e3550" containerID="8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b" exitCode=0 Mar 10 15:39:54 crc kubenswrapper[4795]: I0310 15:39:54.123750 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gm5" event={"ID":"8de14967-42da-49ed-923c-b9f83f6e3550","Type":"ContainerDied","Data":"8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b"} Mar 10 15:39:56 crc kubenswrapper[4795]: I0310 15:39:56.164874 4795 generic.go:334] "Generic (PLEG): container finished" podID="8de14967-42da-49ed-923c-b9f83f6e3550" containerID="f4476e6ea8789cc29641440aa735e98b398aae25c788e933aad2ee9ba4052b0e" exitCode=0 Mar 10 15:39:56 crc kubenswrapper[4795]: I0310 15:39:56.165017 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gm5" 
event={"ID":"8de14967-42da-49ed-923c-b9f83f6e3550","Type":"ContainerDied","Data":"f4476e6ea8789cc29641440aa735e98b398aae25c788e933aad2ee9ba4052b0e"} Mar 10 15:39:57 crc kubenswrapper[4795]: I0310 15:39:57.174874 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gm5" event={"ID":"8de14967-42da-49ed-923c-b9f83f6e3550","Type":"ContainerStarted","Data":"99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59"} Mar 10 15:39:57 crc kubenswrapper[4795]: I0310 15:39:57.198536 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h9gm5" podStartSLOduration=3.70194826 podStartE2EDuration="6.198516849s" podCreationTimestamp="2026-03-10 15:39:51 +0000 UTC" firstStartedPulling="2026-03-10 15:39:54.12602787 +0000 UTC m=+2027.291768768" lastFinishedPulling="2026-03-10 15:39:56.622596459 +0000 UTC m=+2029.788337357" observedRunningTime="2026-03-10 15:39:57.197418677 +0000 UTC m=+2030.363159575" watchObservedRunningTime="2026-03-10 15:39:57.198516849 +0000 UTC m=+2030.364257747" Mar 10 15:40:00 crc kubenswrapper[4795]: I0310 15:40:00.130792 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552620-h77s6"] Mar 10 15:40:00 crc kubenswrapper[4795]: I0310 15:40:00.132586 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552620-h77s6" Mar 10 15:40:00 crc kubenswrapper[4795]: I0310 15:40:00.142275 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:40:00 crc kubenswrapper[4795]: I0310 15:40:00.142390 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:40:00 crc kubenswrapper[4795]: I0310 15:40:00.142462 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:40:00 crc kubenswrapper[4795]: I0310 15:40:00.154871 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552620-h77s6"] Mar 10 15:40:00 crc kubenswrapper[4795]: I0310 15:40:00.261644 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8c7\" (UniqueName: \"kubernetes.io/projected/d8296d71-9097-46ac-9de4-9274c0bd4a5d-kube-api-access-cq8c7\") pod \"auto-csr-approver-29552620-h77s6\" (UID: \"d8296d71-9097-46ac-9de4-9274c0bd4a5d\") " pod="openshift-infra/auto-csr-approver-29552620-h77s6" Mar 10 15:40:00 crc kubenswrapper[4795]: I0310 15:40:00.364259 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8c7\" (UniqueName: \"kubernetes.io/projected/d8296d71-9097-46ac-9de4-9274c0bd4a5d-kube-api-access-cq8c7\") pod \"auto-csr-approver-29552620-h77s6\" (UID: \"d8296d71-9097-46ac-9de4-9274c0bd4a5d\") " pod="openshift-infra/auto-csr-approver-29552620-h77s6" Mar 10 15:40:00 crc kubenswrapper[4795]: I0310 15:40:00.383141 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8c7\" (UniqueName: \"kubernetes.io/projected/d8296d71-9097-46ac-9de4-9274c0bd4a5d-kube-api-access-cq8c7\") pod \"auto-csr-approver-29552620-h77s6\" (UID: \"d8296d71-9097-46ac-9de4-9274c0bd4a5d\") " 
pod="openshift-infra/auto-csr-approver-29552620-h77s6" Mar 10 15:40:00 crc kubenswrapper[4795]: I0310 15:40:00.490210 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552620-h77s6" Mar 10 15:40:00 crc kubenswrapper[4795]: I0310 15:40:00.962411 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552620-h77s6"] Mar 10 15:40:01 crc kubenswrapper[4795]: I0310 15:40:01.208234 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552620-h77s6" event={"ID":"d8296d71-9097-46ac-9de4-9274c0bd4a5d","Type":"ContainerStarted","Data":"501eba7fa32bb01862fb7da40ce6fa98aeba1b04a61c98de672c192336c0c911"} Mar 10 15:40:02 crc kubenswrapper[4795]: I0310 15:40:02.329334 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:40:02 crc kubenswrapper[4795]: I0310 15:40:02.330507 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:40:03 crc kubenswrapper[4795]: I0310 15:40:03.224863 4795 generic.go:334] "Generic (PLEG): container finished" podID="d8296d71-9097-46ac-9de4-9274c0bd4a5d" containerID="cc8a27f8f3777dba39c04267b9f2ca3458d06d7e0fd5ecaaa7d8b696d8e319c9" exitCode=0 Mar 10 15:40:03 crc kubenswrapper[4795]: I0310 15:40:03.224931 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552620-h77s6" event={"ID":"d8296d71-9097-46ac-9de4-9274c0bd4a5d","Type":"ContainerDied","Data":"cc8a27f8f3777dba39c04267b9f2ca3458d06d7e0fd5ecaaa7d8b696d8e319c9"} Mar 10 15:40:03 crc kubenswrapper[4795]: I0310 15:40:03.403311 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-h9gm5" podUID="8de14967-42da-49ed-923c-b9f83f6e3550" containerName="registry-server" probeResult="failure" output=< Mar 10 
15:40:03 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 10 15:40:03 crc kubenswrapper[4795]: > Mar 10 15:40:04 crc kubenswrapper[4795]: I0310 15:40:04.562808 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552620-h77s6" Mar 10 15:40:04 crc kubenswrapper[4795]: I0310 15:40:04.750678 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq8c7\" (UniqueName: \"kubernetes.io/projected/d8296d71-9097-46ac-9de4-9274c0bd4a5d-kube-api-access-cq8c7\") pod \"d8296d71-9097-46ac-9de4-9274c0bd4a5d\" (UID: \"d8296d71-9097-46ac-9de4-9274c0bd4a5d\") " Mar 10 15:40:04 crc kubenswrapper[4795]: I0310 15:40:04.757351 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8296d71-9097-46ac-9de4-9274c0bd4a5d-kube-api-access-cq8c7" (OuterVolumeSpecName: "kube-api-access-cq8c7") pod "d8296d71-9097-46ac-9de4-9274c0bd4a5d" (UID: "d8296d71-9097-46ac-9de4-9274c0bd4a5d"). InnerVolumeSpecName "kube-api-access-cq8c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:40:04 crc kubenswrapper[4795]: I0310 15:40:04.853184 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq8c7\" (UniqueName: \"kubernetes.io/projected/d8296d71-9097-46ac-9de4-9274c0bd4a5d-kube-api-access-cq8c7\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:05 crc kubenswrapper[4795]: I0310 15:40:05.250137 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552620-h77s6" event={"ID":"d8296d71-9097-46ac-9de4-9274c0bd4a5d","Type":"ContainerDied","Data":"501eba7fa32bb01862fb7da40ce6fa98aeba1b04a61c98de672c192336c0c911"} Mar 10 15:40:05 crc kubenswrapper[4795]: I0310 15:40:05.250614 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="501eba7fa32bb01862fb7da40ce6fa98aeba1b04a61c98de672c192336c0c911" Mar 10 15:40:05 crc kubenswrapper[4795]: I0310 15:40:05.250238 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552620-h77s6" Mar 10 15:40:05 crc kubenswrapper[4795]: I0310 15:40:05.624581 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552614-lkqnr"] Mar 10 15:40:05 crc kubenswrapper[4795]: I0310 15:40:05.634224 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552614-lkqnr"] Mar 10 15:40:07 crc kubenswrapper[4795]: I0310 15:40:07.489736 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0990c949-f10b-4a1c-8bae-62f0ea1c0605" path="/var/lib/kubelet/pods/0990c949-f10b-4a1c-8bae-62f0ea1c0605/volumes" Mar 10 15:40:12 crc kubenswrapper[4795]: I0310 15:40:12.388050 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:40:12 crc kubenswrapper[4795]: I0310 15:40:12.442971 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:40:12 crc kubenswrapper[4795]: I0310 15:40:12.623397 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9gm5"] Mar 10 15:40:14 crc kubenswrapper[4795]: I0310 15:40:14.318181 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h9gm5" podUID="8de14967-42da-49ed-923c-b9f83f6e3550" containerName="registry-server" containerID="cri-o://99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59" gracePeriod=2 Mar 10 15:40:14 crc kubenswrapper[4795]: I0310 15:40:14.783791 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:40:14 crc kubenswrapper[4795]: I0310 15:40:14.959343 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx62j\" (UniqueName: \"kubernetes.io/projected/8de14967-42da-49ed-923c-b9f83f6e3550-kube-api-access-vx62j\") pod \"8de14967-42da-49ed-923c-b9f83f6e3550\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " Mar 10 15:40:14 crc kubenswrapper[4795]: I0310 15:40:14.959479 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-catalog-content\") pod \"8de14967-42da-49ed-923c-b9f83f6e3550\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " Mar 10 15:40:14 crc kubenswrapper[4795]: I0310 15:40:14.959637 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-utilities\") pod \"8de14967-42da-49ed-923c-b9f83f6e3550\" (UID: \"8de14967-42da-49ed-923c-b9f83f6e3550\") " Mar 10 15:40:14 crc kubenswrapper[4795]: I0310 15:40:14.961090 4795 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-utilities" (OuterVolumeSpecName: "utilities") pod "8de14967-42da-49ed-923c-b9f83f6e3550" (UID: "8de14967-42da-49ed-923c-b9f83f6e3550"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:40:14 crc kubenswrapper[4795]: I0310 15:40:14.967413 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8de14967-42da-49ed-923c-b9f83f6e3550-kube-api-access-vx62j" (OuterVolumeSpecName: "kube-api-access-vx62j") pod "8de14967-42da-49ed-923c-b9f83f6e3550" (UID: "8de14967-42da-49ed-923c-b9f83f6e3550"). InnerVolumeSpecName "kube-api-access-vx62j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.032815 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8de14967-42da-49ed-923c-b9f83f6e3550" (UID: "8de14967-42da-49ed-923c-b9f83f6e3550"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.062271 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx62j\" (UniqueName: \"kubernetes.io/projected/8de14967-42da-49ed-923c-b9f83f6e3550-kube-api-access-vx62j\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.062299 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.062308 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8de14967-42da-49ed-923c-b9f83f6e3550-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.329938 4795 generic.go:334] "Generic (PLEG): container finished" podID="8de14967-42da-49ed-923c-b9f83f6e3550" containerID="99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59" exitCode=0 Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.330122 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gm5" event={"ID":"8de14967-42da-49ed-923c-b9f83f6e3550","Type":"ContainerDied","Data":"99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59"} Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.330225 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h9gm5" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.330320 4795 scope.go:117] "RemoveContainer" containerID="99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.330304 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h9gm5" event={"ID":"8de14967-42da-49ed-923c-b9f83f6e3550","Type":"ContainerDied","Data":"8492ac493a973091a78d48e9db1194b4c60447d8fc211b1d540e6fd7de6b06f7"} Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.366438 4795 scope.go:117] "RemoveContainer" containerID="f4476e6ea8789cc29641440aa735e98b398aae25c788e933aad2ee9ba4052b0e" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.373229 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h9gm5"] Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.383400 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h9gm5"] Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.395199 4795 scope.go:117] "RemoveContainer" containerID="8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.451200 4795 scope.go:117] "RemoveContainer" containerID="99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59" Mar 10 15:40:15 crc kubenswrapper[4795]: E0310 15:40:15.451695 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59\": container with ID starting with 99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59 not found: ID does not exist" containerID="99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.451739 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59"} err="failed to get container status \"99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59\": rpc error: code = NotFound desc = could not find container \"99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59\": container with ID starting with 99c57a84fb33034aa0af3e56324a983d6a581cccddc9b248e94e903168ceef59 not found: ID does not exist" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.451766 4795 scope.go:117] "RemoveContainer" containerID="f4476e6ea8789cc29641440aa735e98b398aae25c788e933aad2ee9ba4052b0e" Mar 10 15:40:15 crc kubenswrapper[4795]: E0310 15:40:15.452112 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4476e6ea8789cc29641440aa735e98b398aae25c788e933aad2ee9ba4052b0e\": container with ID starting with f4476e6ea8789cc29641440aa735e98b398aae25c788e933aad2ee9ba4052b0e not found: ID does not exist" containerID="f4476e6ea8789cc29641440aa735e98b398aae25c788e933aad2ee9ba4052b0e" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.452138 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4476e6ea8789cc29641440aa735e98b398aae25c788e933aad2ee9ba4052b0e"} err="failed to get container status \"f4476e6ea8789cc29641440aa735e98b398aae25c788e933aad2ee9ba4052b0e\": rpc error: code = NotFound desc = could not find container \"f4476e6ea8789cc29641440aa735e98b398aae25c788e933aad2ee9ba4052b0e\": container with ID starting with f4476e6ea8789cc29641440aa735e98b398aae25c788e933aad2ee9ba4052b0e not found: ID does not exist" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.452157 4795 scope.go:117] "RemoveContainer" containerID="8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b" Mar 10 15:40:15 crc kubenswrapper[4795]: E0310 
15:40:15.452406 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b\": container with ID starting with 8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b not found: ID does not exist" containerID="8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.452435 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b"} err="failed to get container status \"8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b\": rpc error: code = NotFound desc = could not find container \"8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b\": container with ID starting with 8b63904b3c505b0c08e9494e42566faab897de520efd6e2deca2b91de5a44d7b not found: ID does not exist" Mar 10 15:40:15 crc kubenswrapper[4795]: I0310 15:40:15.487609 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8de14967-42da-49ed-923c-b9f83f6e3550" path="/var/lib/kubelet/pods/8de14967-42da-49ed-923c-b9f83f6e3550/volumes" Mar 10 15:40:45 crc kubenswrapper[4795]: I0310 15:40:45.279867 4795 scope.go:117] "RemoveContainer" containerID="67f41b05488a9c156c19dcb82206414d7badb3dd10ca05b347d891a12bda93a6" Mar 10 15:40:46 crc kubenswrapper[4795]: I0310 15:40:46.664823 4795 generic.go:334] "Generic (PLEG): container finished" podID="09d32c78-bc8e-480b-b6e0-bfd1d5eedf66" containerID="f86a397cbd7fe769c416477482de1da8f073fe20cff8d95636c652a5b567cc4b" exitCode=0 Mar 10 15:40:46 crc kubenswrapper[4795]: I0310 15:40:46.664955 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" 
event={"ID":"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66","Type":"ContainerDied","Data":"f86a397cbd7fe769c416477482de1da8f073fe20cff8d95636c652a5b567cc4b"} Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.121556 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.252317 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovncontroller-config-0\") pod \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.252487 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ssh-key-openstack-edpm-ipam\") pod \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.252551 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovn-combined-ca-bundle\") pod \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.252620 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-inventory\") pod \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.252693 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t5ts\" (UniqueName: 
\"kubernetes.io/projected/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-kube-api-access-4t5ts\") pod \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\" (UID: \"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66\") " Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.259627 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "09d32c78-bc8e-480b-b6e0-bfd1d5eedf66" (UID: "09d32c78-bc8e-480b-b6e0-bfd1d5eedf66"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.259637 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-kube-api-access-4t5ts" (OuterVolumeSpecName: "kube-api-access-4t5ts") pod "09d32c78-bc8e-480b-b6e0-bfd1d5eedf66" (UID: "09d32c78-bc8e-480b-b6e0-bfd1d5eedf66"). InnerVolumeSpecName "kube-api-access-4t5ts". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.294883 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "09d32c78-bc8e-480b-b6e0-bfd1d5eedf66" (UID: "09d32c78-bc8e-480b-b6e0-bfd1d5eedf66"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.301003 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "09d32c78-bc8e-480b-b6e0-bfd1d5eedf66" (UID: "09d32c78-bc8e-480b-b6e0-bfd1d5eedf66"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.316439 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-inventory" (OuterVolumeSpecName: "inventory") pod "09d32c78-bc8e-480b-b6e0-bfd1d5eedf66" (UID: "09d32c78-bc8e-480b-b6e0-bfd1d5eedf66"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.355824 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.355867 4795 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.355880 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.355892 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t5ts\" (UniqueName: \"kubernetes.io/projected/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-kube-api-access-4t5ts\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.355904 4795 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/09d32c78-bc8e-480b-b6e0-bfd1d5eedf66-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.539655 4795 patch_prober.go:28] interesting 
pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.540011 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.686253 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" event={"ID":"09d32c78-bc8e-480b-b6e0-bfd1d5eedf66","Type":"ContainerDied","Data":"36187ebf76cc816c00f15cb5bc04ce16d86483ce830e4dfbc8f6231a6388efd6"} Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.686298 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36187ebf76cc816c00f15cb5bc04ce16d86483ce830e4dfbc8f6231a6388efd6" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.686357 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-pf7c6" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.806573 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84"] Mar 10 15:40:48 crc kubenswrapper[4795]: E0310 15:40:48.807290 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de14967-42da-49ed-923c-b9f83f6e3550" containerName="extract-content" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.807309 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de14967-42da-49ed-923c-b9f83f6e3550" containerName="extract-content" Mar 10 15:40:48 crc kubenswrapper[4795]: E0310 15:40:48.807341 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8296d71-9097-46ac-9de4-9274c0bd4a5d" containerName="oc" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.807350 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8296d71-9097-46ac-9de4-9274c0bd4a5d" containerName="oc" Mar 10 15:40:48 crc kubenswrapper[4795]: E0310 15:40:48.807385 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d32c78-bc8e-480b-b6e0-bfd1d5eedf66" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.807393 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d32c78-bc8e-480b-b6e0-bfd1d5eedf66" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 10 15:40:48 crc kubenswrapper[4795]: E0310 15:40:48.807408 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8de14967-42da-49ed-923c-b9f83f6e3550" containerName="extract-utilities" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.807414 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de14967-42da-49ed-923c-b9f83f6e3550" containerName="extract-utilities" Mar 10 15:40:48 crc kubenswrapper[4795]: E0310 15:40:48.807426 4795 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="8de14967-42da-49ed-923c-b9f83f6e3550" containerName="registry-server" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.807431 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8de14967-42da-49ed-923c-b9f83f6e3550" containerName="registry-server" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.807831 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d32c78-bc8e-480b-b6e0-bfd1d5eedf66" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.807868 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8de14967-42da-49ed-923c-b9f83f6e3550" containerName="registry-server" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.807903 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8296d71-9097-46ac-9de4-9274c0bd4a5d" containerName="oc" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.808759 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.812211 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.812848 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.813346 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.813667 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.813780 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.816141 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.834643 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84"] Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.865393 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.866017 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.866302 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.866434 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.866586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.866742 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x659\" (UniqueName: \"kubernetes.io/projected/2cd0441d-482a-4db1-a831-4dfca1afb6f4-kube-api-access-6x659\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.968837 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.968919 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x659\" (UniqueName: \"kubernetes.io/projected/2cd0441d-482a-4db1-a831-4dfca1afb6f4-kube-api-access-6x659\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.969010 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.969033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" 
(UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.969083 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.969170 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.977516 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.977571 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.977593 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.978081 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.980859 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:48 crc kubenswrapper[4795]: I0310 15:40:48.985710 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x659\" (UniqueName: \"kubernetes.io/projected/2cd0441d-482a-4db1-a831-4dfca1afb6f4-kube-api-access-6x659\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84\" (UID: 
\"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:49 crc kubenswrapper[4795]: I0310 15:40:49.139099 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" Mar 10 15:40:49 crc kubenswrapper[4795]: I0310 15:40:49.682210 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84"] Mar 10 15:40:49 crc kubenswrapper[4795]: I0310 15:40:49.699870 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" event={"ID":"2cd0441d-482a-4db1-a831-4dfca1afb6f4","Type":"ContainerStarted","Data":"00d03d39ed3d556101b8349ef434fb3f3f7097236dfc91ebc1ee8897bbc8ebe8"} Mar 10 15:40:50 crc kubenswrapper[4795]: I0310 15:40:50.708337 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" event={"ID":"2cd0441d-482a-4db1-a831-4dfca1afb6f4","Type":"ContainerStarted","Data":"8f3a30e0d3318870c2039e432783bf6954f024f65366855848277cc300a33b5d"} Mar 10 15:40:50 crc kubenswrapper[4795]: I0310 15:40:50.729097 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" podStartSLOduration=2.2046802899999998 podStartE2EDuration="2.729078161s" podCreationTimestamp="2026-03-10 15:40:48 +0000 UTC" firstStartedPulling="2026-03-10 15:40:49.685022073 +0000 UTC m=+2082.850762972" lastFinishedPulling="2026-03-10 15:40:50.209419915 +0000 UTC m=+2083.375160843" observedRunningTime="2026-03-10 15:40:50.72766383 +0000 UTC m=+2083.893404728" watchObservedRunningTime="2026-03-10 15:40:50.729078161 +0000 UTC m=+2083.894819049" Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.527833 4795 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-7s8zc"] Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.563959 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.590157 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7s8zc"] Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.595837 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-utilities\") pod \"redhat-operators-7s8zc\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.595937 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q5r4\" (UniqueName: \"kubernetes.io/projected/23ca904c-719e-4a83-a94f-4b4515e5e481-kube-api-access-8q5r4\") pod \"redhat-operators-7s8zc\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.596135 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-catalog-content\") pod \"redhat-operators-7s8zc\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.697676 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-utilities\") pod \"redhat-operators-7s8zc\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " 
pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.697732 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q5r4\" (UniqueName: \"kubernetes.io/projected/23ca904c-719e-4a83-a94f-4b4515e5e481-kube-api-access-8q5r4\") pod \"redhat-operators-7s8zc\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.697789 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-catalog-content\") pod \"redhat-operators-7s8zc\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.698237 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-utilities\") pod \"redhat-operators-7s8zc\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.698317 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-catalog-content\") pod \"redhat-operators-7s8zc\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.716969 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q5r4\" (UniqueName: \"kubernetes.io/projected/23ca904c-719e-4a83-a94f-4b4515e5e481-kube-api-access-8q5r4\") pod \"redhat-operators-7s8zc\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " pod="openshift-marketplace/redhat-operators-7s8zc" Mar 
10 15:41:09 crc kubenswrapper[4795]: I0310 15:41:09.887342 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:10 crc kubenswrapper[4795]: I0310 15:41:10.368696 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7s8zc"] Mar 10 15:41:10 crc kubenswrapper[4795]: I0310 15:41:10.916049 4795 generic.go:334] "Generic (PLEG): container finished" podID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerID="a577256dd165f53b884aa9152bf0eb5841fbde7fabf2d33185dad5fced3b1fdc" exitCode=0 Mar 10 15:41:10 crc kubenswrapper[4795]: I0310 15:41:10.916152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s8zc" event={"ID":"23ca904c-719e-4a83-a94f-4b4515e5e481","Type":"ContainerDied","Data":"a577256dd165f53b884aa9152bf0eb5841fbde7fabf2d33185dad5fced3b1fdc"} Mar 10 15:41:10 crc kubenswrapper[4795]: I0310 15:41:10.917364 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s8zc" event={"ID":"23ca904c-719e-4a83-a94f-4b4515e5e481","Type":"ContainerStarted","Data":"253cff34063e510c1a6c4ea1bfecfe6dc10d4681751bd5e3949d575d3d129044"} Mar 10 15:41:12 crc kubenswrapper[4795]: I0310 15:41:12.939263 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s8zc" event={"ID":"23ca904c-719e-4a83-a94f-4b4515e5e481","Type":"ContainerDied","Data":"710dfc3d6461299a5fd6d70052e4acaa2a3cb0bce286c39f4f0168c842fe0439"} Mar 10 15:41:12 crc kubenswrapper[4795]: I0310 15:41:12.940087 4795 generic.go:334] "Generic (PLEG): container finished" podID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerID="710dfc3d6461299a5fd6d70052e4acaa2a3cb0bce286c39f4f0168c842fe0439" exitCode=0 Mar 10 15:41:13 crc kubenswrapper[4795]: I0310 15:41:13.951542 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s8zc" 
event={"ID":"23ca904c-719e-4a83-a94f-4b4515e5e481","Type":"ContainerStarted","Data":"12f0dce77d0c622ea05550aea9795bca5c7446b0230030abe03ceebcbe605bd5"} Mar 10 15:41:13 crc kubenswrapper[4795]: I0310 15:41:13.974675 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7s8zc" podStartSLOduration=2.532068471 podStartE2EDuration="4.974653131s" podCreationTimestamp="2026-03-10 15:41:09 +0000 UTC" firstStartedPulling="2026-03-10 15:41:10.918029854 +0000 UTC m=+2104.083770752" lastFinishedPulling="2026-03-10 15:41:13.360614504 +0000 UTC m=+2106.526355412" observedRunningTime="2026-03-10 15:41:13.96865497 +0000 UTC m=+2107.134395868" watchObservedRunningTime="2026-03-10 15:41:13.974653131 +0000 UTC m=+2107.140394019" Mar 10 15:41:18 crc kubenswrapper[4795]: I0310 15:41:18.538757 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:41:18 crc kubenswrapper[4795]: I0310 15:41:18.539119 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.507216 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w7gsx"] Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.509992 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.532906 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7gsx"] Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.600274 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfk9k\" (UniqueName: \"kubernetes.io/projected/38b41993-8a4f-4889-a39a-cdb5d1151df6-kube-api-access-jfk9k\") pod \"redhat-marketplace-w7gsx\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") " pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.600425 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-utilities\") pod \"redhat-marketplace-w7gsx\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") " pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.600575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-catalog-content\") pod \"redhat-marketplace-w7gsx\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") " pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.703033 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfk9k\" (UniqueName: \"kubernetes.io/projected/38b41993-8a4f-4889-a39a-cdb5d1151df6-kube-api-access-jfk9k\") pod \"redhat-marketplace-w7gsx\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") " pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.703119 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-utilities\") pod \"redhat-marketplace-w7gsx\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") " pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.703152 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-catalog-content\") pod \"redhat-marketplace-w7gsx\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") " pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.703650 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-utilities\") pod \"redhat-marketplace-w7gsx\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") " pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.703678 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-catalog-content\") pod \"redhat-marketplace-w7gsx\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") " pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.722497 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfk9k\" (UniqueName: \"kubernetes.io/projected/38b41993-8a4f-4889-a39a-cdb5d1151df6-kube-api-access-jfk9k\") pod \"redhat-marketplace-w7gsx\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") " pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.829626 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.888510 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:19 crc kubenswrapper[4795]: I0310 15:41:19.888899 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:20 crc kubenswrapper[4795]: I0310 15:41:20.289475 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7gsx"] Mar 10 15:41:20 crc kubenswrapper[4795]: I0310 15:41:20.941198 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7s8zc" podUID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerName="registry-server" probeResult="failure" output=< Mar 10 15:41:20 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 10 15:41:20 crc kubenswrapper[4795]: > Mar 10 15:41:21 crc kubenswrapper[4795]: I0310 15:41:21.017399 4795 generic.go:334] "Generic (PLEG): container finished" podID="38b41993-8a4f-4889-a39a-cdb5d1151df6" containerID="af926db21d1b665e49344d1a468068a0e54f6bf50632356136f4c2f29a9f34ab" exitCode=0 Mar 10 15:41:21 crc kubenswrapper[4795]: I0310 15:41:21.017460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7gsx" event={"ID":"38b41993-8a4f-4889-a39a-cdb5d1151df6","Type":"ContainerDied","Data":"af926db21d1b665e49344d1a468068a0e54f6bf50632356136f4c2f29a9f34ab"} Mar 10 15:41:21 crc kubenswrapper[4795]: I0310 15:41:21.017896 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7gsx" event={"ID":"38b41993-8a4f-4889-a39a-cdb5d1151df6","Type":"ContainerStarted","Data":"b17ecbdcda864aa65e8798eab3dc73381a442394458205e66c9980aebb85ee6f"} Mar 10 15:41:22 crc kubenswrapper[4795]: I0310 
15:41:22.041291 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7gsx" event={"ID":"38b41993-8a4f-4889-a39a-cdb5d1151df6","Type":"ContainerStarted","Data":"5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7"} Mar 10 15:41:23 crc kubenswrapper[4795]: I0310 15:41:23.054763 4795 generic.go:334] "Generic (PLEG): container finished" podID="38b41993-8a4f-4889-a39a-cdb5d1151df6" containerID="5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7" exitCode=0 Mar 10 15:41:23 crc kubenswrapper[4795]: I0310 15:41:23.054811 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7gsx" event={"ID":"38b41993-8a4f-4889-a39a-cdb5d1151df6","Type":"ContainerDied","Data":"5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7"} Mar 10 15:41:24 crc kubenswrapper[4795]: I0310 15:41:24.064809 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7gsx" event={"ID":"38b41993-8a4f-4889-a39a-cdb5d1151df6","Type":"ContainerStarted","Data":"ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8"} Mar 10 15:41:24 crc kubenswrapper[4795]: I0310 15:41:24.082819 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w7gsx" podStartSLOduration=2.285674775 podStartE2EDuration="5.082794723s" podCreationTimestamp="2026-03-10 15:41:19 +0000 UTC" firstStartedPulling="2026-03-10 15:41:21.019936549 +0000 UTC m=+2114.185677457" lastFinishedPulling="2026-03-10 15:41:23.817056497 +0000 UTC m=+2116.982797405" observedRunningTime="2026-03-10 15:41:24.081220288 +0000 UTC m=+2117.246961186" watchObservedRunningTime="2026-03-10 15:41:24.082794723 +0000 UTC m=+2117.248535621" Mar 10 15:41:29 crc kubenswrapper[4795]: I0310 15:41:29.830352 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:29 crc kubenswrapper[4795]: I0310 15:41:29.831011 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:29 crc kubenswrapper[4795]: I0310 15:41:29.882291 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:29 crc kubenswrapper[4795]: I0310 15:41:29.943441 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:29 crc kubenswrapper[4795]: I0310 15:41:29.993845 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:30 crc kubenswrapper[4795]: I0310 15:41:30.179813 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w7gsx" Mar 10 15:41:31 crc kubenswrapper[4795]: I0310 15:41:31.934141 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7s8zc"] Mar 10 15:41:31 crc kubenswrapper[4795]: I0310 15:41:31.934647 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7s8zc" podUID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerName="registry-server" containerID="cri-o://12f0dce77d0c622ea05550aea9795bca5c7446b0230030abe03ceebcbe605bd5" gracePeriod=2 Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.181488 4795 generic.go:334] "Generic (PLEG): container finished" podID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerID="12f0dce77d0c622ea05550aea9795bca5c7446b0230030abe03ceebcbe605bd5" exitCode=0 Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.181542 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s8zc" 
event={"ID":"23ca904c-719e-4a83-a94f-4b4515e5e481","Type":"ContainerDied","Data":"12f0dce77d0c622ea05550aea9795bca5c7446b0230030abe03ceebcbe605bd5"} Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.438342 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7s8zc" Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.464165 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-utilities\") pod \"23ca904c-719e-4a83-a94f-4b4515e5e481\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.464228 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q5r4\" (UniqueName: \"kubernetes.io/projected/23ca904c-719e-4a83-a94f-4b4515e5e481-kube-api-access-8q5r4\") pod \"23ca904c-719e-4a83-a94f-4b4515e5e481\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.464341 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-catalog-content\") pod \"23ca904c-719e-4a83-a94f-4b4515e5e481\" (UID: \"23ca904c-719e-4a83-a94f-4b4515e5e481\") " Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.464912 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-utilities" (OuterVolumeSpecName: "utilities") pod "23ca904c-719e-4a83-a94f-4b4515e5e481" (UID: "23ca904c-719e-4a83-a94f-4b4515e5e481"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.471347 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23ca904c-719e-4a83-a94f-4b4515e5e481-kube-api-access-8q5r4" (OuterVolumeSpecName: "kube-api-access-8q5r4") pod "23ca904c-719e-4a83-a94f-4b4515e5e481" (UID: "23ca904c-719e-4a83-a94f-4b4515e5e481"). InnerVolumeSpecName "kube-api-access-8q5r4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.533384 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7gsx"]
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.534174 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w7gsx" podUID="38b41993-8a4f-4889-a39a-cdb5d1151df6" containerName="registry-server" containerID="cri-o://ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8" gracePeriod=2
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.567671 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.567701 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q5r4\" (UniqueName: \"kubernetes.io/projected/23ca904c-719e-4a83-a94f-4b4515e5e481-kube-api-access-8q5r4\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.585080 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23ca904c-719e-4a83-a94f-4b4515e5e481" (UID: "23ca904c-719e-4a83-a94f-4b4515e5e481"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.674043 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23ca904c-719e-4a83-a94f-4b4515e5e481-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.907377 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7gsx"
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.978376 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfk9k\" (UniqueName: \"kubernetes.io/projected/38b41993-8a4f-4889-a39a-cdb5d1151df6-kube-api-access-jfk9k\") pod \"38b41993-8a4f-4889-a39a-cdb5d1151df6\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") "
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.978447 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-utilities\") pod \"38b41993-8a4f-4889-a39a-cdb5d1151df6\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") "
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.978546 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-catalog-content\") pod \"38b41993-8a4f-4889-a39a-cdb5d1151df6\" (UID: \"38b41993-8a4f-4889-a39a-cdb5d1151df6\") "
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.978934 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-utilities" (OuterVolumeSpecName: "utilities") pod "38b41993-8a4f-4889-a39a-cdb5d1151df6" (UID: "38b41993-8a4f-4889-a39a-cdb5d1151df6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.979112 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-utilities\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:32 crc kubenswrapper[4795]: I0310 15:41:32.984851 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b41993-8a4f-4889-a39a-cdb5d1151df6-kube-api-access-jfk9k" (OuterVolumeSpecName: "kube-api-access-jfk9k") pod "38b41993-8a4f-4889-a39a-cdb5d1151df6" (UID: "38b41993-8a4f-4889-a39a-cdb5d1151df6"). InnerVolumeSpecName "kube-api-access-jfk9k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.004900 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38b41993-8a4f-4889-a39a-cdb5d1151df6" (UID: "38b41993-8a4f-4889-a39a-cdb5d1151df6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.080813 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfk9k\" (UniqueName: \"kubernetes.io/projected/38b41993-8a4f-4889-a39a-cdb5d1151df6-kube-api-access-jfk9k\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.081172 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38b41993-8a4f-4889-a39a-cdb5d1151df6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.192577 4795 generic.go:334] "Generic (PLEG): container finished" podID="38b41993-8a4f-4889-a39a-cdb5d1151df6" containerID="ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8" exitCode=0
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.192639 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7gsx"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.192683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7gsx" event={"ID":"38b41993-8a4f-4889-a39a-cdb5d1151df6","Type":"ContainerDied","Data":"ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8"}
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.192765 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7gsx" event={"ID":"38b41993-8a4f-4889-a39a-cdb5d1151df6","Type":"ContainerDied","Data":"b17ecbdcda864aa65e8798eab3dc73381a442394458205e66c9980aebb85ee6f"}
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.192806 4795 scope.go:117] "RemoveContainer" containerID="ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.195087 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7s8zc" event={"ID":"23ca904c-719e-4a83-a94f-4b4515e5e481","Type":"ContainerDied","Data":"253cff34063e510c1a6c4ea1bfecfe6dc10d4681751bd5e3949d575d3d129044"}
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.195172 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7s8zc"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.216576 4795 scope.go:117] "RemoveContainer" containerID="5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.240620 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7gsx"]
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.241052 4795 scope.go:117] "RemoveContainer" containerID="af926db21d1b665e49344d1a468068a0e54f6bf50632356136f4c2f29a9f34ab"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.249762 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7gsx"]
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.257421 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7s8zc"]
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.264635 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7s8zc"]
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.281180 4795 scope.go:117] "RemoveContainer" containerID="ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8"
Mar 10 15:41:33 crc kubenswrapper[4795]: E0310 15:41:33.281640 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8\": container with ID starting with ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8 not found: ID does not exist" containerID="ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.281678 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8"} err="failed to get container status \"ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8\": rpc error: code = NotFound desc = could not find container \"ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8\": container with ID starting with ee6936e6f621107cd624112e25c0269394b6ba4c25e7dfbbe54c67153bb5b8a8 not found: ID does not exist"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.281701 4795 scope.go:117] "RemoveContainer" containerID="5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7"
Mar 10 15:41:33 crc kubenswrapper[4795]: E0310 15:41:33.282094 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7\": container with ID starting with 5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7 not found: ID does not exist" containerID="5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.282126 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7"} err="failed to get container status \"5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7\": rpc error: code = NotFound desc = could not find container \"5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7\": container with ID starting with 5554de78a22cf79e979acae50cf021d3daae6691c05221b23524934734cc9cf7 not found: ID does not exist"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.282145 4795 scope.go:117] "RemoveContainer" containerID="af926db21d1b665e49344d1a468068a0e54f6bf50632356136f4c2f29a9f34ab"
Mar 10 15:41:33 crc kubenswrapper[4795]: E0310 15:41:33.282361 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af926db21d1b665e49344d1a468068a0e54f6bf50632356136f4c2f29a9f34ab\": container with ID starting with af926db21d1b665e49344d1a468068a0e54f6bf50632356136f4c2f29a9f34ab not found: ID does not exist" containerID="af926db21d1b665e49344d1a468068a0e54f6bf50632356136f4c2f29a9f34ab"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.282388 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af926db21d1b665e49344d1a468068a0e54f6bf50632356136f4c2f29a9f34ab"} err="failed to get container status \"af926db21d1b665e49344d1a468068a0e54f6bf50632356136f4c2f29a9f34ab\": rpc error: code = NotFound desc = could not find container \"af926db21d1b665e49344d1a468068a0e54f6bf50632356136f4c2f29a9f34ab\": container with ID starting with af926db21d1b665e49344d1a468068a0e54f6bf50632356136f4c2f29a9f34ab not found: ID does not exist"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.282406 4795 scope.go:117] "RemoveContainer" containerID="12f0dce77d0c622ea05550aea9795bca5c7446b0230030abe03ceebcbe605bd5"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.322743 4795 scope.go:117] "RemoveContainer" containerID="710dfc3d6461299a5fd6d70052e4acaa2a3cb0bce286c39f4f0168c842fe0439"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.344127 4795 scope.go:117] "RemoveContainer" containerID="a577256dd165f53b884aa9152bf0eb5841fbde7fabf2d33185dad5fced3b1fdc"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.503707 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23ca904c-719e-4a83-a94f-4b4515e5e481" path="/var/lib/kubelet/pods/23ca904c-719e-4a83-a94f-4b4515e5e481/volumes"
Mar 10 15:41:33 crc kubenswrapper[4795]: I0310 15:41:33.506271 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b41993-8a4f-4889-a39a-cdb5d1151df6" path="/var/lib/kubelet/pods/38b41993-8a4f-4889-a39a-cdb5d1151df6/volumes"
Mar 10 15:41:35 crc kubenswrapper[4795]: I0310 15:41:35.219429 4795 generic.go:334] "Generic (PLEG): container finished" podID="2cd0441d-482a-4db1-a831-4dfca1afb6f4" containerID="8f3a30e0d3318870c2039e432783bf6954f024f65366855848277cc300a33b5d" exitCode=0
Mar 10 15:41:35 crc kubenswrapper[4795]: I0310 15:41:35.219481 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" event={"ID":"2cd0441d-482a-4db1-a831-4dfca1afb6f4","Type":"ContainerDied","Data":"8f3a30e0d3318870c2039e432783bf6954f024f65366855848277cc300a33b5d"}
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.645873 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84"
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.761360 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-ssh-key-openstack-edpm-ipam\") pod \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") "
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.761447 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-ovn-metadata-agent-neutron-config-0\") pod \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") "
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.761590 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-nova-metadata-neutron-config-0\") pod \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") "
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.761634 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-metadata-combined-ca-bundle\") pod \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") "
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.761711 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x659\" (UniqueName: \"kubernetes.io/projected/2cd0441d-482a-4db1-a831-4dfca1afb6f4-kube-api-access-6x659\") pod \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") "
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.761736 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-inventory\") pod \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\" (UID: \"2cd0441d-482a-4db1-a831-4dfca1afb6f4\") "
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.766860 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "2cd0441d-482a-4db1-a831-4dfca1afb6f4" (UID: "2cd0441d-482a-4db1-a831-4dfca1afb6f4"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.768825 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd0441d-482a-4db1-a831-4dfca1afb6f4-kube-api-access-6x659" (OuterVolumeSpecName: "kube-api-access-6x659") pod "2cd0441d-482a-4db1-a831-4dfca1afb6f4" (UID: "2cd0441d-482a-4db1-a831-4dfca1afb6f4"). InnerVolumeSpecName "kube-api-access-6x659". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.794330 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2cd0441d-482a-4db1-a831-4dfca1afb6f4" (UID: "2cd0441d-482a-4db1-a831-4dfca1afb6f4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.797310 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-inventory" (OuterVolumeSpecName: "inventory") pod "2cd0441d-482a-4db1-a831-4dfca1afb6f4" (UID: "2cd0441d-482a-4db1-a831-4dfca1afb6f4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.799483 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "2cd0441d-482a-4db1-a831-4dfca1afb6f4" (UID: "2cd0441d-482a-4db1-a831-4dfca1afb6f4"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.803183 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "2cd0441d-482a-4db1-a831-4dfca1afb6f4" (UID: "2cd0441d-482a-4db1-a831-4dfca1afb6f4"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.865340 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.865392 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.865412 4795 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.865427 4795 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.865443 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x659\" (UniqueName: \"kubernetes.io/projected/2cd0441d-482a-4db1-a831-4dfca1afb6f4-kube-api-access-6x659\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:36 crc kubenswrapper[4795]: I0310 15:41:36.865459 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cd0441d-482a-4db1-a831-4dfca1afb6f4-inventory\") on node \"crc\" DevicePath \"\""
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.240957 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84" event={"ID":"2cd0441d-482a-4db1-a831-4dfca1afb6f4","Type":"ContainerDied","Data":"00d03d39ed3d556101b8349ef434fb3f3f7097236dfc91ebc1ee8897bbc8ebe8"}
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.241001 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00d03d39ed3d556101b8349ef434fb3f3f7097236dfc91ebc1ee8897bbc8ebe8"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.241104 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.326439 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"]
Mar 10 15:41:37 crc kubenswrapper[4795]: E0310 15:41:37.326951 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b41993-8a4f-4889-a39a-cdb5d1151df6" containerName="extract-utilities"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.326973 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b41993-8a4f-4889-a39a-cdb5d1151df6" containerName="extract-utilities"
Mar 10 15:41:37 crc kubenswrapper[4795]: E0310 15:41:37.327001 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b41993-8a4f-4889-a39a-cdb5d1151df6" containerName="extract-content"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.327009 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b41993-8a4f-4889-a39a-cdb5d1151df6" containerName="extract-content"
Mar 10 15:41:37 crc kubenswrapper[4795]: E0310 15:41:37.327027 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38b41993-8a4f-4889-a39a-cdb5d1151df6" containerName="registry-server"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.327035 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b41993-8a4f-4889-a39a-cdb5d1151df6" containerName="registry-server"
Mar 10 15:41:37 crc kubenswrapper[4795]: E0310 15:41:37.327049 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerName="extract-utilities"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.327057 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerName="extract-utilities"
Mar 10 15:41:37 crc kubenswrapper[4795]: E0310 15:41:37.327095 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd0441d-482a-4db1-a831-4dfca1afb6f4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.327105 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd0441d-482a-4db1-a831-4dfca1afb6f4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 10 15:41:37 crc kubenswrapper[4795]: E0310 15:41:37.327115 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerName="registry-server"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.327125 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerName="registry-server"
Mar 10 15:41:37 crc kubenswrapper[4795]: E0310 15:41:37.327144 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerName="extract-content"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.327152 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerName="extract-content"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.327359 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="23ca904c-719e-4a83-a94f-4b4515e5e481" containerName="registry-server"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.327389 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="38b41993-8a4f-4889-a39a-cdb5d1151df6" containerName="registry-server"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.327403 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd0441d-482a-4db1-a831-4dfca1afb6f4" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.328163 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.331052 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.331594 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.331641 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.332420 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.332740 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.355587 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"]
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.483489 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29pjl\" (UniqueName: \"kubernetes.io/projected/e98c0b12-b750-4125-b3c6-170a91a0aa0e-kube-api-access-29pjl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.483586 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.483653 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.483703 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.483737 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.585914 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.586021 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.586094 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.586130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.586192 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29pjl\" (UniqueName: \"kubernetes.io/projected/e98c0b12-b750-4125-b3c6-170a91a0aa0e-kube-api-access-29pjl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.593027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.593228 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.594099 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.597954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.611110 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29pjl\" (UniqueName: \"kubernetes.io/projected/e98c0b12-b750-4125-b3c6-170a91a0aa0e-kube-api-access-29pjl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:37 crc kubenswrapper[4795]: I0310 15:41:37.653863 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"
Mar 10 15:41:38 crc kubenswrapper[4795]: I0310 15:41:38.004292 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5"]
Mar 10 15:41:38 crc kubenswrapper[4795]: I0310 15:41:38.253389 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5" event={"ID":"e98c0b12-b750-4125-b3c6-170a91a0aa0e","Type":"ContainerStarted","Data":"a22a3d81c16035a248ec5b4d8ca366ba4247df2aedb9293ee7e490944650f961"}
Mar 10 15:41:39 crc kubenswrapper[4795]: I0310 15:41:39.263901 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5" event={"ID":"e98c0b12-b750-4125-b3c6-170a91a0aa0e","Type":"ContainerStarted","Data":"249e1f7a36e9630d6ff6760ccfaa6e15ae57b715be807cb9d1137163553fe22e"}
Mar 10 15:41:39 crc kubenswrapper[4795]: I0310 15:41:39.294985 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5" podStartSLOduration=1.8825543580000001 podStartE2EDuration="2.294961475s" podCreationTimestamp="2026-03-10 15:41:37 +0000 UTC" firstStartedPulling="2026-03-10 15:41:37.991253436 +0000 UTC m=+2131.156994334" lastFinishedPulling="2026-03-10 15:41:38.403660553 +0000 UTC m=+2131.569401451" observedRunningTime="2026-03-10 15:41:39.27971121 +0000 UTC m=+2132.445452178" watchObservedRunningTime="2026-03-10 15:41:39.294961475 +0000 UTC m=+2132.460702383"
Mar 10 15:41:48 crc kubenswrapper[4795]: I0310 15:41:48.539004 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:41:48 crc kubenswrapper[4795]: I0310 15:41:48.539691 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:41:48 crc kubenswrapper[4795]: I0310 15:41:48.539755 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh"
Mar 10 15:41:48 crc kubenswrapper[4795]: I0310 15:41:48.540927 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ef273750e0cbde25fb287f8a8956b379d7b9f04565ff3bcef74366fd9990cc5"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 15:41:48 crc kubenswrapper[4795]: I0310 15:41:48.541023 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://0ef273750e0cbde25fb287f8a8956b379d7b9f04565ff3bcef74366fd9990cc5" gracePeriod=600
Mar 10 15:41:49 crc kubenswrapper[4795]: I0310 15:41:49.358454 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="0ef273750e0cbde25fb287f8a8956b379d7b9f04565ff3bcef74366fd9990cc5" exitCode=0
Mar 10 15:41:49 crc kubenswrapper[4795]: I0310 15:41:49.358531 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"0ef273750e0cbde25fb287f8a8956b379d7b9f04565ff3bcef74366fd9990cc5"}
Mar 10 15:41:49 crc kubenswrapper[4795]: I0310 15:41:49.359106 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917"}
Mar 10 15:41:49 crc kubenswrapper[4795]: I0310 15:41:49.359143 4795 scope.go:117] "RemoveContainer" containerID="2238aefe82c5b3c186803110189ee27d742e624bb294dd8f1309169e4a5ccf64"
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.148702 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552622-mvzps"]
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.151004 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552622-mvzps"
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.154014 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.154292 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.154388 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5"
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.161953 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552622-mvzps"]
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.264504 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9bkp\" (UniqueName: \"kubernetes.io/projected/93a6c803-148d-4abd-bca2-dab66a3a155e-kube-api-access-z9bkp\") pod \"auto-csr-approver-29552622-mvzps\" (UID: \"93a6c803-148d-4abd-bca2-dab66a3a155e\") " pod="openshift-infra/auto-csr-approver-29552622-mvzps"
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.367466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9bkp\" (UniqueName: \"kubernetes.io/projected/93a6c803-148d-4abd-bca2-dab66a3a155e-kube-api-access-z9bkp\") pod \"auto-csr-approver-29552622-mvzps\" (UID: \"93a6c803-148d-4abd-bca2-dab66a3a155e\") " pod="openshift-infra/auto-csr-approver-29552622-mvzps"
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.397014 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9bkp\" (UniqueName: \"kubernetes.io/projected/93a6c803-148d-4abd-bca2-dab66a3a155e-kube-api-access-z9bkp\") pod \"auto-csr-approver-29552622-mvzps\" (UID: \"93a6c803-148d-4abd-bca2-dab66a3a155e\") " pod="openshift-infra/auto-csr-approver-29552622-mvzps"
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.478840 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552622-mvzps"
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.976267 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552622-mvzps"]
Mar 10 15:42:00 crc kubenswrapper[4795]: W0310 15:42:00.986970 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93a6c803_148d_4abd_bca2_dab66a3a155e.slice/crio-3650569537c66b94bbb19fd4eddb353d3bcb9b3abe46843549c858e712e7b513 WatchSource:0}: Error finding container 3650569537c66b94bbb19fd4eddb353d3bcb9b3abe46843549c858e712e7b513: Status 404 returned error can't find the container with id 3650569537c66b94bbb19fd4eddb353d3bcb9b3abe46843549c858e712e7b513
Mar 10 15:42:00 crc kubenswrapper[4795]: I0310 15:42:00.990505 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 10 15:42:01 crc kubenswrapper[4795]:
I0310 15:42:01.494826 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552622-mvzps" event={"ID":"93a6c803-148d-4abd-bca2-dab66a3a155e","Type":"ContainerStarted","Data":"3650569537c66b94bbb19fd4eddb353d3bcb9b3abe46843549c858e712e7b513"} Mar 10 15:42:02 crc kubenswrapper[4795]: I0310 15:42:02.514451 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552622-mvzps" event={"ID":"93a6c803-148d-4abd-bca2-dab66a3a155e","Type":"ContainerStarted","Data":"ecc897a71e7c1e253a2dfa4e270f3a556d93f3fccfebda126b74442882d9ceba"} Mar 10 15:42:02 crc kubenswrapper[4795]: I0310 15:42:02.536787 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552622-mvzps" podStartSLOduration=1.440473085 podStartE2EDuration="2.536770262s" podCreationTimestamp="2026-03-10 15:42:00 +0000 UTC" firstStartedPulling="2026-03-10 15:42:00.990256569 +0000 UTC m=+2154.155997467" lastFinishedPulling="2026-03-10 15:42:02.086553746 +0000 UTC m=+2155.252294644" observedRunningTime="2026-03-10 15:42:02.530665938 +0000 UTC m=+2155.696406836" watchObservedRunningTime="2026-03-10 15:42:02.536770262 +0000 UTC m=+2155.702511160" Mar 10 15:42:03 crc kubenswrapper[4795]: I0310 15:42:03.529114 4795 generic.go:334] "Generic (PLEG): container finished" podID="93a6c803-148d-4abd-bca2-dab66a3a155e" containerID="ecc897a71e7c1e253a2dfa4e270f3a556d93f3fccfebda126b74442882d9ceba" exitCode=0 Mar 10 15:42:03 crc kubenswrapper[4795]: I0310 15:42:03.529197 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552622-mvzps" event={"ID":"93a6c803-148d-4abd-bca2-dab66a3a155e","Type":"ContainerDied","Data":"ecc897a71e7c1e253a2dfa4e270f3a556d93f3fccfebda126b74442882d9ceba"} Mar 10 15:42:04 crc kubenswrapper[4795]: I0310 15:42:04.863649 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552622-mvzps" Mar 10 15:42:04 crc kubenswrapper[4795]: I0310 15:42:04.949688 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9bkp\" (UniqueName: \"kubernetes.io/projected/93a6c803-148d-4abd-bca2-dab66a3a155e-kube-api-access-z9bkp\") pod \"93a6c803-148d-4abd-bca2-dab66a3a155e\" (UID: \"93a6c803-148d-4abd-bca2-dab66a3a155e\") " Mar 10 15:42:04 crc kubenswrapper[4795]: I0310 15:42:04.956491 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93a6c803-148d-4abd-bca2-dab66a3a155e-kube-api-access-z9bkp" (OuterVolumeSpecName: "kube-api-access-z9bkp") pod "93a6c803-148d-4abd-bca2-dab66a3a155e" (UID: "93a6c803-148d-4abd-bca2-dab66a3a155e"). InnerVolumeSpecName "kube-api-access-z9bkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:42:05 crc kubenswrapper[4795]: I0310 15:42:05.051816 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9bkp\" (UniqueName: \"kubernetes.io/projected/93a6c803-148d-4abd-bca2-dab66a3a155e-kube-api-access-z9bkp\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:05 crc kubenswrapper[4795]: I0310 15:42:05.545264 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552622-mvzps" event={"ID":"93a6c803-148d-4abd-bca2-dab66a3a155e","Type":"ContainerDied","Data":"3650569537c66b94bbb19fd4eddb353d3bcb9b3abe46843549c858e712e7b513"} Mar 10 15:42:05 crc kubenswrapper[4795]: I0310 15:42:05.545304 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3650569537c66b94bbb19fd4eddb353d3bcb9b3abe46843549c858e712e7b513" Mar 10 15:42:05 crc kubenswrapper[4795]: I0310 15:42:05.545337 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552622-mvzps" Mar 10 15:42:05 crc kubenswrapper[4795]: I0310 15:42:05.597189 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552616-2sphr"] Mar 10 15:42:05 crc kubenswrapper[4795]: I0310 15:42:05.604174 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552616-2sphr"] Mar 10 15:42:07 crc kubenswrapper[4795]: I0310 15:42:07.510227 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91365c19-0f5e-41c5-b530-5aa4a7062f02" path="/var/lib/kubelet/pods/91365c19-0f5e-41c5-b530-5aa4a7062f02/volumes" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.515631 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7qz89"] Mar 10 15:42:11 crc kubenswrapper[4795]: E0310 15:42:11.516720 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93a6c803-148d-4abd-bca2-dab66a3a155e" containerName="oc" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.516740 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="93a6c803-148d-4abd-bca2-dab66a3a155e" containerName="oc" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.516952 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="93a6c803-148d-4abd-bca2-dab66a3a155e" containerName="oc" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.518458 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.539092 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7qz89"] Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.680948 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-utilities\") pod \"community-operators-7qz89\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.681019 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsqvc\" (UniqueName: \"kubernetes.io/projected/be184a33-3705-4f4e-83eb-b3cfb84bac90-kube-api-access-rsqvc\") pod \"community-operators-7qz89\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.681085 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-catalog-content\") pod \"community-operators-7qz89\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.783264 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-utilities\") pod \"community-operators-7qz89\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.783320 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rsqvc\" (UniqueName: \"kubernetes.io/projected/be184a33-3705-4f4e-83eb-b3cfb84bac90-kube-api-access-rsqvc\") pod \"community-operators-7qz89\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.783355 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-catalog-content\") pod \"community-operators-7qz89\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.784122 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-utilities\") pod \"community-operators-7qz89\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.784191 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-catalog-content\") pod \"community-operators-7qz89\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.803848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsqvc\" (UniqueName: \"kubernetes.io/projected/be184a33-3705-4f4e-83eb-b3cfb84bac90-kube-api-access-rsqvc\") pod \"community-operators-7qz89\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:11 crc kubenswrapper[4795]: I0310 15:42:11.859193 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:12 crc kubenswrapper[4795]: W0310 15:42:12.399247 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe184a33_3705_4f4e_83eb_b3cfb84bac90.slice/crio-8c2fcff31a8a39e3005ca2980537c691afa47786dc46fb4219020b676e0fa18c WatchSource:0}: Error finding container 8c2fcff31a8a39e3005ca2980537c691afa47786dc46fb4219020b676e0fa18c: Status 404 returned error can't find the container with id 8c2fcff31a8a39e3005ca2980537c691afa47786dc46fb4219020b676e0fa18c Mar 10 15:42:12 crc kubenswrapper[4795]: I0310 15:42:12.402507 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7qz89"] Mar 10 15:42:12 crc kubenswrapper[4795]: I0310 15:42:12.628683 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qz89" event={"ID":"be184a33-3705-4f4e-83eb-b3cfb84bac90","Type":"ContainerStarted","Data":"5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a"} Mar 10 15:42:12 crc kubenswrapper[4795]: I0310 15:42:12.629054 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qz89" event={"ID":"be184a33-3705-4f4e-83eb-b3cfb84bac90","Type":"ContainerStarted","Data":"8c2fcff31a8a39e3005ca2980537c691afa47786dc46fb4219020b676e0fa18c"} Mar 10 15:42:13 crc kubenswrapper[4795]: I0310 15:42:13.636918 4795 generic.go:334] "Generic (PLEG): container finished" podID="be184a33-3705-4f4e-83eb-b3cfb84bac90" containerID="5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a" exitCode=0 Mar 10 15:42:13 crc kubenswrapper[4795]: I0310 15:42:13.637015 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qz89" 
event={"ID":"be184a33-3705-4f4e-83eb-b3cfb84bac90","Type":"ContainerDied","Data":"5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a"} Mar 10 15:42:14 crc kubenswrapper[4795]: I0310 15:42:14.647281 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qz89" event={"ID":"be184a33-3705-4f4e-83eb-b3cfb84bac90","Type":"ContainerStarted","Data":"e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f"} Mar 10 15:42:15 crc kubenswrapper[4795]: I0310 15:42:15.661928 4795 generic.go:334] "Generic (PLEG): container finished" podID="be184a33-3705-4f4e-83eb-b3cfb84bac90" containerID="e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f" exitCode=0 Mar 10 15:42:15 crc kubenswrapper[4795]: I0310 15:42:15.662056 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qz89" event={"ID":"be184a33-3705-4f4e-83eb-b3cfb84bac90","Type":"ContainerDied","Data":"e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f"} Mar 10 15:42:16 crc kubenswrapper[4795]: I0310 15:42:16.672780 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qz89" event={"ID":"be184a33-3705-4f4e-83eb-b3cfb84bac90","Type":"ContainerStarted","Data":"3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d"} Mar 10 15:42:16 crc kubenswrapper[4795]: I0310 15:42:16.702623 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7qz89" podStartSLOduration=3.023969642 podStartE2EDuration="5.702602212s" podCreationTimestamp="2026-03-10 15:42:11 +0000 UTC" firstStartedPulling="2026-03-10 15:42:13.638647726 +0000 UTC m=+2166.804388624" lastFinishedPulling="2026-03-10 15:42:16.317280256 +0000 UTC m=+2169.483021194" observedRunningTime="2026-03-10 15:42:16.692481824 +0000 UTC m=+2169.858222742" watchObservedRunningTime="2026-03-10 15:42:16.702602212 +0000 UTC 
m=+2169.868343130" Mar 10 15:42:21 crc kubenswrapper[4795]: I0310 15:42:21.859971 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:21 crc kubenswrapper[4795]: I0310 15:42:21.860821 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:21 crc kubenswrapper[4795]: I0310 15:42:21.934682 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:22 crc kubenswrapper[4795]: I0310 15:42:22.803942 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:22 crc kubenswrapper[4795]: I0310 15:42:22.851341 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7qz89"] Mar 10 15:42:24 crc kubenswrapper[4795]: I0310 15:42:24.759569 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7qz89" podUID="be184a33-3705-4f4e-83eb-b3cfb84bac90" containerName="registry-server" containerID="cri-o://3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d" gracePeriod=2 Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.598256 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.674448 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-catalog-content\") pod \"be184a33-3705-4f4e-83eb-b3cfb84bac90\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.674583 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-utilities\") pod \"be184a33-3705-4f4e-83eb-b3cfb84bac90\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.674741 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsqvc\" (UniqueName: \"kubernetes.io/projected/be184a33-3705-4f4e-83eb-b3cfb84bac90-kube-api-access-rsqvc\") pod \"be184a33-3705-4f4e-83eb-b3cfb84bac90\" (UID: \"be184a33-3705-4f4e-83eb-b3cfb84bac90\") " Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.676595 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-utilities" (OuterVolumeSpecName: "utilities") pod "be184a33-3705-4f4e-83eb-b3cfb84bac90" (UID: "be184a33-3705-4f4e-83eb-b3cfb84bac90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.681924 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be184a33-3705-4f4e-83eb-b3cfb84bac90-kube-api-access-rsqvc" (OuterVolumeSpecName: "kube-api-access-rsqvc") pod "be184a33-3705-4f4e-83eb-b3cfb84bac90" (UID: "be184a33-3705-4f4e-83eb-b3cfb84bac90"). InnerVolumeSpecName "kube-api-access-rsqvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.724502 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be184a33-3705-4f4e-83eb-b3cfb84bac90" (UID: "be184a33-3705-4f4e-83eb-b3cfb84bac90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.770121 4795 generic.go:334] "Generic (PLEG): container finished" podID="be184a33-3705-4f4e-83eb-b3cfb84bac90" containerID="3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d" exitCode=0 Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.770185 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7qz89" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.770193 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qz89" event={"ID":"be184a33-3705-4f4e-83eb-b3cfb84bac90","Type":"ContainerDied","Data":"3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d"} Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.770261 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7qz89" event={"ID":"be184a33-3705-4f4e-83eb-b3cfb84bac90","Type":"ContainerDied","Data":"8c2fcff31a8a39e3005ca2980537c691afa47786dc46fb4219020b676e0fa18c"} Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.770285 4795 scope.go:117] "RemoveContainer" containerID="3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.791534 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsqvc\" (UniqueName: 
\"kubernetes.io/projected/be184a33-3705-4f4e-83eb-b3cfb84bac90-kube-api-access-rsqvc\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.791576 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.791602 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be184a33-3705-4f4e-83eb-b3cfb84bac90-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.797890 4795 scope.go:117] "RemoveContainer" containerID="e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.825213 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7qz89"] Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.835290 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7qz89"] Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.839646 4795 scope.go:117] "RemoveContainer" containerID="5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.872263 4795 scope.go:117] "RemoveContainer" containerID="3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d" Mar 10 15:42:25 crc kubenswrapper[4795]: E0310 15:42:25.872696 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d\": container with ID starting with 3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d not found: ID does not exist" containerID="3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d" 
Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.872726 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d"} err="failed to get container status \"3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d\": rpc error: code = NotFound desc = could not find container \"3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d\": container with ID starting with 3efeb4ce4c00ce8897962a31efae54388eb432c65aff733d80bdcffce87f672d not found: ID does not exist" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.872748 4795 scope.go:117] "RemoveContainer" containerID="e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f" Mar 10 15:42:25 crc kubenswrapper[4795]: E0310 15:42:25.873205 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f\": container with ID starting with e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f not found: ID does not exist" containerID="e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.873316 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f"} err="failed to get container status \"e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f\": rpc error: code = NotFound desc = could not find container \"e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f\": container with ID starting with e4a44562ea28bbe86f86330f1dc945c07b9bc3f002a08072e171ecaaed3fe87f not found: ID does not exist" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.873399 4795 scope.go:117] "RemoveContainer" 
containerID="5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a" Mar 10 15:42:25 crc kubenswrapper[4795]: E0310 15:42:25.873733 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a\": container with ID starting with 5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a not found: ID does not exist" containerID="5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a" Mar 10 15:42:25 crc kubenswrapper[4795]: I0310 15:42:25.873760 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a"} err="failed to get container status \"5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a\": rpc error: code = NotFound desc = could not find container \"5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a\": container with ID starting with 5a42bae2807f2b7bdcb2b2223260ed2e65a8b81080b0fab93a05182d6fe8264a not found: ID does not exist" Mar 10 15:42:27 crc kubenswrapper[4795]: I0310 15:42:27.492544 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be184a33-3705-4f4e-83eb-b3cfb84bac90" path="/var/lib/kubelet/pods/be184a33-3705-4f4e-83eb-b3cfb84bac90/volumes" Mar 10 15:42:45 crc kubenswrapper[4795]: I0310 15:42:45.454824 4795 scope.go:117] "RemoveContainer" containerID="fb1285a2dcc580e8d90d3f4ddc8310487a11a86c3679d12cb616b1ce3b40fa01" Mar 10 15:43:48 crc kubenswrapper[4795]: I0310 15:43:48.538834 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:43:48 crc kubenswrapper[4795]: I0310 15:43:48.539391 4795 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.146454 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552624-qdjg5"] Mar 10 15:44:00 crc kubenswrapper[4795]: E0310 15:44:00.147267 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be184a33-3705-4f4e-83eb-b3cfb84bac90" containerName="registry-server" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.147282 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="be184a33-3705-4f4e-83eb-b3cfb84bac90" containerName="registry-server" Mar 10 15:44:00 crc kubenswrapper[4795]: E0310 15:44:00.147298 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be184a33-3705-4f4e-83eb-b3cfb84bac90" containerName="extract-content" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.147304 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="be184a33-3705-4f4e-83eb-b3cfb84bac90" containerName="extract-content" Mar 10 15:44:00 crc kubenswrapper[4795]: E0310 15:44:00.147327 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be184a33-3705-4f4e-83eb-b3cfb84bac90" containerName="extract-utilities" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.147333 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="be184a33-3705-4f4e-83eb-b3cfb84bac90" containerName="extract-utilities" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.147507 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="be184a33-3705-4f4e-83eb-b3cfb84bac90" containerName="registry-server" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.148228 4795 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552624-qdjg5" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.150243 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.150378 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.157832 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.160508 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552624-qdjg5"] Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.334377 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psvsc\" (UniqueName: \"kubernetes.io/projected/6931d53a-1555-45b7-9191-978aaf51871f-kube-api-access-psvsc\") pod \"auto-csr-approver-29552624-qdjg5\" (UID: \"6931d53a-1555-45b7-9191-978aaf51871f\") " pod="openshift-infra/auto-csr-approver-29552624-qdjg5" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.436312 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psvsc\" (UniqueName: \"kubernetes.io/projected/6931d53a-1555-45b7-9191-978aaf51871f-kube-api-access-psvsc\") pod \"auto-csr-approver-29552624-qdjg5\" (UID: \"6931d53a-1555-45b7-9191-978aaf51871f\") " pod="openshift-infra/auto-csr-approver-29552624-qdjg5" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.461599 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psvsc\" (UniqueName: \"kubernetes.io/projected/6931d53a-1555-45b7-9191-978aaf51871f-kube-api-access-psvsc\") pod \"auto-csr-approver-29552624-qdjg5\" (UID: \"6931d53a-1555-45b7-9191-978aaf51871f\") 
" pod="openshift-infra/auto-csr-approver-29552624-qdjg5" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.465980 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552624-qdjg5" Mar 10 15:44:00 crc kubenswrapper[4795]: I0310 15:44:00.877718 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552624-qdjg5"] Mar 10 15:44:01 crc kubenswrapper[4795]: I0310 15:44:01.770544 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552624-qdjg5" event={"ID":"6931d53a-1555-45b7-9191-978aaf51871f","Type":"ContainerStarted","Data":"f0eac8ce615ccfebfebd63bd73715f8f623bc2185e17989c3d4c250cc194bb31"} Mar 10 15:44:03 crc kubenswrapper[4795]: I0310 15:44:03.790277 4795 generic.go:334] "Generic (PLEG): container finished" podID="6931d53a-1555-45b7-9191-978aaf51871f" containerID="0286b35b4cf1b65b8a1593c79b84f1086ba4c73235386ef1aca3ee6c6289d210" exitCode=0 Mar 10 15:44:03 crc kubenswrapper[4795]: I0310 15:44:03.790344 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552624-qdjg5" event={"ID":"6931d53a-1555-45b7-9191-978aaf51871f","Type":"ContainerDied","Data":"0286b35b4cf1b65b8a1593c79b84f1086ba4c73235386ef1aca3ee6c6289d210"} Mar 10 15:44:05 crc kubenswrapper[4795]: I0310 15:44:05.431605 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552624-qdjg5" Mar 10 15:44:05 crc kubenswrapper[4795]: I0310 15:44:05.619496 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psvsc\" (UniqueName: \"kubernetes.io/projected/6931d53a-1555-45b7-9191-978aaf51871f-kube-api-access-psvsc\") pod \"6931d53a-1555-45b7-9191-978aaf51871f\" (UID: \"6931d53a-1555-45b7-9191-978aaf51871f\") " Mar 10 15:44:05 crc kubenswrapper[4795]: I0310 15:44:05.639512 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6931d53a-1555-45b7-9191-978aaf51871f-kube-api-access-psvsc" (OuterVolumeSpecName: "kube-api-access-psvsc") pod "6931d53a-1555-45b7-9191-978aaf51871f" (UID: "6931d53a-1555-45b7-9191-978aaf51871f"). InnerVolumeSpecName "kube-api-access-psvsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:44:05 crc kubenswrapper[4795]: I0310 15:44:05.721472 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psvsc\" (UniqueName: \"kubernetes.io/projected/6931d53a-1555-45b7-9191-978aaf51871f-kube-api-access-psvsc\") on node \"crc\" DevicePath \"\"" Mar 10 15:44:05 crc kubenswrapper[4795]: I0310 15:44:05.814438 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552624-qdjg5" event={"ID":"6931d53a-1555-45b7-9191-978aaf51871f","Type":"ContainerDied","Data":"f0eac8ce615ccfebfebd63bd73715f8f623bc2185e17989c3d4c250cc194bb31"} Mar 10 15:44:05 crc kubenswrapper[4795]: I0310 15:44:05.814486 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552624-qdjg5" Mar 10 15:44:05 crc kubenswrapper[4795]: I0310 15:44:05.814499 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0eac8ce615ccfebfebd63bd73715f8f623bc2185e17989c3d4c250cc194bb31" Mar 10 15:44:08 crc kubenswrapper[4795]: I0310 15:44:08.453007 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552618-2s9m9"] Mar 10 15:44:08 crc kubenswrapper[4795]: I0310 15:44:08.468697 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552618-2s9m9"] Mar 10 15:44:09 crc kubenswrapper[4795]: I0310 15:44:09.629916 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794fab49-0b94-4e9d-b291-ae2a4654f7fe" path="/var/lib/kubelet/pods/794fab49-0b94-4e9d-b291-ae2a4654f7fe/volumes" Mar 10 15:44:18 crc kubenswrapper[4795]: I0310 15:44:18.539027 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:44:18 crc kubenswrapper[4795]: I0310 15:44:18.539588 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:44:45 crc kubenswrapper[4795]: I0310 15:44:45.619974 4795 scope.go:117] "RemoveContainer" containerID="26140592fc8d7e2848b169053a3d627cb2a36f7dee16b177256cc7b0cb73736b" Mar 10 15:44:48 crc kubenswrapper[4795]: I0310 15:44:48.539736 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:44:48 crc kubenswrapper[4795]: I0310 15:44:48.540326 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:44:48 crc kubenswrapper[4795]: I0310 15:44:48.540402 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:44:48 crc kubenswrapper[4795]: I0310 15:44:48.541186 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:44:48 crc kubenswrapper[4795]: I0310 15:44:48.541244 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" gracePeriod=600 Mar 10 15:44:48 crc kubenswrapper[4795]: E0310 15:44:48.663785 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:44:48 crc kubenswrapper[4795]: I0310 15:44:48.998509 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" exitCode=0 Mar 10 15:44:48 crc kubenswrapper[4795]: I0310 15:44:48.998578 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917"} Mar 10 15:44:48 crc kubenswrapper[4795]: I0310 15:44:48.998864 4795 scope.go:117] "RemoveContainer" containerID="0ef273750e0cbde25fb287f8a8956b379d7b9f04565ff3bcef74366fd9990cc5" Mar 10 15:44:48 crc kubenswrapper[4795]: I0310 15:44:48.999453 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:44:48 crc kubenswrapper[4795]: E0310 15:44:48.999846 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.160719 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff"] Mar 10 15:45:00 crc kubenswrapper[4795]: E0310 15:45:00.161819 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6931d53a-1555-45b7-9191-978aaf51871f" containerName="oc" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 
15:45:00.161839 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="6931d53a-1555-45b7-9191-978aaf51871f" containerName="oc" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.162092 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="6931d53a-1555-45b7-9191-978aaf51871f" containerName="oc" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.162936 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.165224 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.170813 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.174746 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff"] Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.332843 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9l4r\" (UniqueName: \"kubernetes.io/projected/969e70e9-84f5-46fd-814f-4fc7c06a6394-kube-api-access-r9l4r\") pod \"collect-profiles-29552625-lr2ff\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.333028 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/969e70e9-84f5-46fd-814f-4fc7c06a6394-config-volume\") pod \"collect-profiles-29552625-lr2ff\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.333120 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/969e70e9-84f5-46fd-814f-4fc7c06a6394-secret-volume\") pod \"collect-profiles-29552625-lr2ff\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.434952 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/969e70e9-84f5-46fd-814f-4fc7c06a6394-config-volume\") pod \"collect-profiles-29552625-lr2ff\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.435053 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/969e70e9-84f5-46fd-814f-4fc7c06a6394-secret-volume\") pod \"collect-profiles-29552625-lr2ff\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.435112 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9l4r\" (UniqueName: \"kubernetes.io/projected/969e70e9-84f5-46fd-814f-4fc7c06a6394-kube-api-access-r9l4r\") pod \"collect-profiles-29552625-lr2ff\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.436211 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/969e70e9-84f5-46fd-814f-4fc7c06a6394-config-volume\") pod \"collect-profiles-29552625-lr2ff\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.440866 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/969e70e9-84f5-46fd-814f-4fc7c06a6394-secret-volume\") pod \"collect-profiles-29552625-lr2ff\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.450897 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9l4r\" (UniqueName: \"kubernetes.io/projected/969e70e9-84f5-46fd-814f-4fc7c06a6394-kube-api-access-r9l4r\") pod \"collect-profiles-29552625-lr2ff\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.487248 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:00 crc kubenswrapper[4795]: I0310 15:45:00.969841 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff"] Mar 10 15:45:01 crc kubenswrapper[4795]: I0310 15:45:01.118181 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" event={"ID":"969e70e9-84f5-46fd-814f-4fc7c06a6394","Type":"ContainerStarted","Data":"d1ed2e4127d5cfce29becf3289495e9c0051cc6ed5537f1cb8e405003337f7a1"} Mar 10 15:45:01 crc kubenswrapper[4795]: I0310 15:45:01.476925 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:45:01 crc kubenswrapper[4795]: E0310 15:45:01.477728 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:45:02 crc kubenswrapper[4795]: I0310 15:45:02.130844 4795 generic.go:334] "Generic (PLEG): container finished" podID="969e70e9-84f5-46fd-814f-4fc7c06a6394" containerID="b679939b798b740a2e48128a86b47362afa835fe9c4b880801fa0e2044fa8535" exitCode=0 Mar 10 15:45:02 crc kubenswrapper[4795]: I0310 15:45:02.130895 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" event={"ID":"969e70e9-84f5-46fd-814f-4fc7c06a6394","Type":"ContainerDied","Data":"b679939b798b740a2e48128a86b47362afa835fe9c4b880801fa0e2044fa8535"} Mar 10 15:45:03 crc kubenswrapper[4795]: I0310 15:45:03.478217 4795 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:03 crc kubenswrapper[4795]: I0310 15:45:03.599856 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/969e70e9-84f5-46fd-814f-4fc7c06a6394-secret-volume\") pod \"969e70e9-84f5-46fd-814f-4fc7c06a6394\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " Mar 10 15:45:03 crc kubenswrapper[4795]: I0310 15:45:03.600292 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/969e70e9-84f5-46fd-814f-4fc7c06a6394-config-volume\") pod \"969e70e9-84f5-46fd-814f-4fc7c06a6394\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " Mar 10 15:45:03 crc kubenswrapper[4795]: I0310 15:45:03.600492 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9l4r\" (UniqueName: \"kubernetes.io/projected/969e70e9-84f5-46fd-814f-4fc7c06a6394-kube-api-access-r9l4r\") pod \"969e70e9-84f5-46fd-814f-4fc7c06a6394\" (UID: \"969e70e9-84f5-46fd-814f-4fc7c06a6394\") " Mar 10 15:45:03 crc kubenswrapper[4795]: I0310 15:45:03.600899 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/969e70e9-84f5-46fd-814f-4fc7c06a6394-config-volume" (OuterVolumeSpecName: "config-volume") pod "969e70e9-84f5-46fd-814f-4fc7c06a6394" (UID: "969e70e9-84f5-46fd-814f-4fc7c06a6394"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:45:03 crc kubenswrapper[4795]: I0310 15:45:03.618451 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/969e70e9-84f5-46fd-814f-4fc7c06a6394-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "969e70e9-84f5-46fd-814f-4fc7c06a6394" (UID: "969e70e9-84f5-46fd-814f-4fc7c06a6394"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:45:03 crc kubenswrapper[4795]: I0310 15:45:03.618477 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/969e70e9-84f5-46fd-814f-4fc7c06a6394-kube-api-access-r9l4r" (OuterVolumeSpecName: "kube-api-access-r9l4r") pod "969e70e9-84f5-46fd-814f-4fc7c06a6394" (UID: "969e70e9-84f5-46fd-814f-4fc7c06a6394"). InnerVolumeSpecName "kube-api-access-r9l4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:45:03 crc kubenswrapper[4795]: I0310 15:45:03.703207 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9l4r\" (UniqueName: \"kubernetes.io/projected/969e70e9-84f5-46fd-814f-4fc7c06a6394-kube-api-access-r9l4r\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:03 crc kubenswrapper[4795]: I0310 15:45:03.703250 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/969e70e9-84f5-46fd-814f-4fc7c06a6394-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:03 crc kubenswrapper[4795]: I0310 15:45:03.703260 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/969e70e9-84f5-46fd-814f-4fc7c06a6394-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:04 crc kubenswrapper[4795]: I0310 15:45:04.156570 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" event={"ID":"969e70e9-84f5-46fd-814f-4fc7c06a6394","Type":"ContainerDied","Data":"d1ed2e4127d5cfce29becf3289495e9c0051cc6ed5537f1cb8e405003337f7a1"} Mar 10 15:45:04 crc kubenswrapper[4795]: I0310 15:45:04.156918 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1ed2e4127d5cfce29becf3289495e9c0051cc6ed5537f1cb8e405003337f7a1" Mar 10 15:45:04 crc kubenswrapper[4795]: I0310 15:45:04.157084 4795 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552625-lr2ff" Mar 10 15:45:04 crc kubenswrapper[4795]: I0310 15:45:04.565171 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"] Mar 10 15:45:04 crc kubenswrapper[4795]: I0310 15:45:04.574780 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552580-v6pjn"] Mar 10 15:45:05 crc kubenswrapper[4795]: I0310 15:45:05.496144 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6c8d10-50e2-458e-a3fd-0b67c039c705" path="/var/lib/kubelet/pods/6b6c8d10-50e2-458e-a3fd-0b67c039c705/volumes" Mar 10 15:45:12 crc kubenswrapper[4795]: I0310 15:45:12.477189 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:45:12 crc kubenswrapper[4795]: E0310 15:45:12.478145 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:45:23 crc kubenswrapper[4795]: I0310 15:45:23.477183 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:45:23 crc kubenswrapper[4795]: E0310 15:45:23.478576 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:45:35 crc kubenswrapper[4795]: I0310 15:45:35.476567 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:45:35 crc kubenswrapper[4795]: E0310 15:45:35.477350 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:45:41 crc kubenswrapper[4795]: I0310 15:45:41.516637 4795 generic.go:334] "Generic (PLEG): container finished" podID="e98c0b12-b750-4125-b3c6-170a91a0aa0e" containerID="249e1f7a36e9630d6ff6760ccfaa6e15ae57b715be807cb9d1137163553fe22e" exitCode=0 Mar 10 15:45:41 crc kubenswrapper[4795]: I0310 15:45:41.516706 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5" event={"ID":"e98c0b12-b750-4125-b3c6-170a91a0aa0e","Type":"ContainerDied","Data":"249e1f7a36e9630d6ff6760ccfaa6e15ae57b715be807cb9d1137163553fe22e"} Mar 10 15:45:42 crc kubenswrapper[4795]: I0310 15:45:42.947702 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.108589 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29pjl\" (UniqueName: \"kubernetes.io/projected/e98c0b12-b750-4125-b3c6-170a91a0aa0e-kube-api-access-29pjl\") pod \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.108733 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-inventory\") pod \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.108797 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-combined-ca-bundle\") pod \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.108827 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-ssh-key-openstack-edpm-ipam\") pod \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.108859 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-secret-0\") pod \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\" (UID: \"e98c0b12-b750-4125-b3c6-170a91a0aa0e\") " Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.115043 4795 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e98c0b12-b750-4125-b3c6-170a91a0aa0e-kube-api-access-29pjl" (OuterVolumeSpecName: "kube-api-access-29pjl") pod "e98c0b12-b750-4125-b3c6-170a91a0aa0e" (UID: "e98c0b12-b750-4125-b3c6-170a91a0aa0e"). InnerVolumeSpecName "kube-api-access-29pjl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.117757 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e98c0b12-b750-4125-b3c6-170a91a0aa0e" (UID: "e98c0b12-b750-4125-b3c6-170a91a0aa0e"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.134990 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e98c0b12-b750-4125-b3c6-170a91a0aa0e" (UID: "e98c0b12-b750-4125-b3c6-170a91a0aa0e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.146738 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-inventory" (OuterVolumeSpecName: "inventory") pod "e98c0b12-b750-4125-b3c6-170a91a0aa0e" (UID: "e98c0b12-b750-4125-b3c6-170a91a0aa0e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.153289 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "e98c0b12-b750-4125-b3c6-170a91a0aa0e" (UID: "e98c0b12-b750-4125-b3c6-170a91a0aa0e"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.211843 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.211885 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.211901 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.211913 4795 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/e98c0b12-b750-4125-b3c6-170a91a0aa0e-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.211928 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29pjl\" (UniqueName: \"kubernetes.io/projected/e98c0b12-b750-4125-b3c6-170a91a0aa0e-kube-api-access-29pjl\") on node \"crc\" DevicePath \"\"" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.533756 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5" event={"ID":"e98c0b12-b750-4125-b3c6-170a91a0aa0e","Type":"ContainerDied","Data":"a22a3d81c16035a248ec5b4d8ca366ba4247df2aedb9293ee7e490944650f961"} Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.533800 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a22a3d81c16035a248ec5b4d8ca366ba4247df2aedb9293ee7e490944650f961" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.533838 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.627547 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt"] Mar 10 15:45:43 crc kubenswrapper[4795]: E0310 15:45:43.627990 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="969e70e9-84f5-46fd-814f-4fc7c06a6394" containerName="collect-profiles" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.628030 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="969e70e9-84f5-46fd-814f-4fc7c06a6394" containerName="collect-profiles" Mar 10 15:45:43 crc kubenswrapper[4795]: E0310 15:45:43.628090 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e98c0b12-b750-4125-b3c6-170a91a0aa0e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.628101 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="e98c0b12-b750-4125-b3c6-170a91a0aa0e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.628324 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="e98c0b12-b750-4125-b3c6-170a91a0aa0e" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.628346 4795 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="969e70e9-84f5-46fd-814f-4fc7c06a6394" containerName="collect-profiles" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.629106 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.638433 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.639217 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.639408 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.639570 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.639720 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.640365 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.640484 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.645572 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt"] Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.722695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.723045 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.723114 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.723141 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.723180 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: 
\"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.723243 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.723272 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhdzd\" (UniqueName: \"kubernetes.io/projected/c8b65855-20a1-4f05-9d95-89cc7a05baaa-kube-api-access-rhdzd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.723312 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.723401 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 
15:45:43.723423 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.723441 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.824388 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.824431 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.824458 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.824510 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.824533 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhdzd\" (UniqueName: \"kubernetes.io/projected/c8b65855-20a1-4f05-9d95-89cc7a05baaa-kube-api-access-rhdzd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.824563 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.824604 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.824621 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.824636 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.824685 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.824701 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.826362 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.828541 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.828563 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.828805 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.829162 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" 
Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.829296 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.829302 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.829994 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.833696 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.833843 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-ssh-key-openstack-edpm-ipam\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.845834 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhdzd\" (UniqueName: \"kubernetes.io/projected/c8b65855-20a1-4f05-9d95-89cc7a05baaa-kube-api-access-rhdzd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-snhgt\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:43 crc kubenswrapper[4795]: I0310 15:45:43.971173 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:45:44 crc kubenswrapper[4795]: I0310 15:45:44.534571 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt"] Mar 10 15:45:44 crc kubenswrapper[4795]: W0310 15:45:44.552622 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8b65855_20a1_4f05_9d95_89cc7a05baaa.slice/crio-831944549a7c8a241539a62f53131526005447974bb886a14bab70a140f3152a WatchSource:0}: Error finding container 831944549a7c8a241539a62f53131526005447974bb886a14bab70a140f3152a: Status 404 returned error can't find the container with id 831944549a7c8a241539a62f53131526005447974bb886a14bab70a140f3152a Mar 10 15:45:45 crc kubenswrapper[4795]: I0310 15:45:45.559182 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" event={"ID":"c8b65855-20a1-4f05-9d95-89cc7a05baaa","Type":"ContainerStarted","Data":"831944549a7c8a241539a62f53131526005447974bb886a14bab70a140f3152a"} Mar 10 15:45:45 crc kubenswrapper[4795]: I0310 15:45:45.706831 4795 scope.go:117] "RemoveContainer" 
containerID="0d6d43abbd7b03d20bfd1f879854e7939f893d7aea104c9c2362ee9fecc27326" Mar 10 15:45:46 crc kubenswrapper[4795]: I0310 15:45:46.568591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" event={"ID":"c8b65855-20a1-4f05-9d95-89cc7a05baaa","Type":"ContainerStarted","Data":"79a4b518b449a11e7a47947c4867cb563f2c46333fad220382c4647eb11e6921"} Mar 10 15:45:46 crc kubenswrapper[4795]: I0310 15:45:46.605888 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" podStartSLOduration=2.77337458 podStartE2EDuration="3.605867759s" podCreationTimestamp="2026-03-10 15:45:43 +0000 UTC" firstStartedPulling="2026-03-10 15:45:44.558103931 +0000 UTC m=+2377.723844829" lastFinishedPulling="2026-03-10 15:45:45.39059712 +0000 UTC m=+2378.556338008" observedRunningTime="2026-03-10 15:45:46.594107503 +0000 UTC m=+2379.759848411" watchObservedRunningTime="2026-03-10 15:45:46.605867759 +0000 UTC m=+2379.771608657" Mar 10 15:45:47 crc kubenswrapper[4795]: I0310 15:45:47.503873 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:45:47 crc kubenswrapper[4795]: E0310 15:45:47.505001 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.137115 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552626-ql86v"] Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.139488 4795 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552626-ql86v" Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.141732 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.141751 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.142438 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.147892 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552626-ql86v"] Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.271313 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhnsn\" (UniqueName: \"kubernetes.io/projected/5f87be56-5142-4c58-a84a-10d641bd37a9-kube-api-access-zhnsn\") pod \"auto-csr-approver-29552626-ql86v\" (UID: \"5f87be56-5142-4c58-a84a-10d641bd37a9\") " pod="openshift-infra/auto-csr-approver-29552626-ql86v" Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.377585 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhnsn\" (UniqueName: \"kubernetes.io/projected/5f87be56-5142-4c58-a84a-10d641bd37a9-kube-api-access-zhnsn\") pod \"auto-csr-approver-29552626-ql86v\" (UID: \"5f87be56-5142-4c58-a84a-10d641bd37a9\") " pod="openshift-infra/auto-csr-approver-29552626-ql86v" Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.410954 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhnsn\" (UniqueName: \"kubernetes.io/projected/5f87be56-5142-4c58-a84a-10d641bd37a9-kube-api-access-zhnsn\") pod \"auto-csr-approver-29552626-ql86v\" (UID: 
\"5f87be56-5142-4c58-a84a-10d641bd37a9\") " pod="openshift-infra/auto-csr-approver-29552626-ql86v" Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.465006 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552626-ql86v" Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.477443 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:46:00 crc kubenswrapper[4795]: E0310 15:46:00.477726 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:46:00 crc kubenswrapper[4795]: I0310 15:46:00.902199 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552626-ql86v"] Mar 10 15:46:01 crc kubenswrapper[4795]: I0310 15:46:01.712808 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552626-ql86v" event={"ID":"5f87be56-5142-4c58-a84a-10d641bd37a9","Type":"ContainerStarted","Data":"a7bfba19d7291786ce94f983124118a2e43fe3b55982b2aff82862ba07e56ef1"} Mar 10 15:46:03 crc kubenswrapper[4795]: I0310 15:46:03.732395 4795 generic.go:334] "Generic (PLEG): container finished" podID="5f87be56-5142-4c58-a84a-10d641bd37a9" containerID="c9683c32df4e647bac039fd5aed5867ca019063f8d4a28caab16d3f75cc536a5" exitCode=0 Mar 10 15:46:03 crc kubenswrapper[4795]: I0310 15:46:03.732729 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552626-ql86v" 
event={"ID":"5f87be56-5142-4c58-a84a-10d641bd37a9","Type":"ContainerDied","Data":"c9683c32df4e647bac039fd5aed5867ca019063f8d4a28caab16d3f75cc536a5"} Mar 10 15:46:05 crc kubenswrapper[4795]: I0310 15:46:05.100706 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552626-ql86v" Mar 10 15:46:05 crc kubenswrapper[4795]: I0310 15:46:05.273186 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhnsn\" (UniqueName: \"kubernetes.io/projected/5f87be56-5142-4c58-a84a-10d641bd37a9-kube-api-access-zhnsn\") pod \"5f87be56-5142-4c58-a84a-10d641bd37a9\" (UID: \"5f87be56-5142-4c58-a84a-10d641bd37a9\") " Mar 10 15:46:05 crc kubenswrapper[4795]: I0310 15:46:05.278638 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f87be56-5142-4c58-a84a-10d641bd37a9-kube-api-access-zhnsn" (OuterVolumeSpecName: "kube-api-access-zhnsn") pod "5f87be56-5142-4c58-a84a-10d641bd37a9" (UID: "5f87be56-5142-4c58-a84a-10d641bd37a9"). InnerVolumeSpecName "kube-api-access-zhnsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:46:05 crc kubenswrapper[4795]: I0310 15:46:05.377730 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhnsn\" (UniqueName: \"kubernetes.io/projected/5f87be56-5142-4c58-a84a-10d641bd37a9-kube-api-access-zhnsn\") on node \"crc\" DevicePath \"\"" Mar 10 15:46:05 crc kubenswrapper[4795]: I0310 15:46:05.755180 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552626-ql86v" event={"ID":"5f87be56-5142-4c58-a84a-10d641bd37a9","Type":"ContainerDied","Data":"a7bfba19d7291786ce94f983124118a2e43fe3b55982b2aff82862ba07e56ef1"} Mar 10 15:46:05 crc kubenswrapper[4795]: I0310 15:46:05.755227 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7bfba19d7291786ce94f983124118a2e43fe3b55982b2aff82862ba07e56ef1" Mar 10 15:46:05 crc kubenswrapper[4795]: I0310 15:46:05.755231 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552626-ql86v" Mar 10 15:46:06 crc kubenswrapper[4795]: I0310 15:46:06.188961 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552620-h77s6"] Mar 10 15:46:06 crc kubenswrapper[4795]: I0310 15:46:06.195301 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552620-h77s6"] Mar 10 15:46:07 crc kubenswrapper[4795]: I0310 15:46:07.492117 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8296d71-9097-46ac-9de4-9274c0bd4a5d" path="/var/lib/kubelet/pods/d8296d71-9097-46ac-9de4-9274c0bd4a5d/volumes" Mar 10 15:46:14 crc kubenswrapper[4795]: I0310 15:46:14.477236 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:46:14 crc kubenswrapper[4795]: E0310 15:46:14.478430 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:46:27 crc kubenswrapper[4795]: I0310 15:46:27.486372 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:46:27 crc kubenswrapper[4795]: E0310 15:46:27.487237 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:46:41 crc kubenswrapper[4795]: I0310 15:46:41.477030 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:46:41 crc kubenswrapper[4795]: E0310 15:46:41.478044 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:46:45 crc kubenswrapper[4795]: I0310 15:46:45.786777 4795 scope.go:117] "RemoveContainer" containerID="cc8a27f8f3777dba39c04267b9f2ca3458d06d7e0fd5ecaaa7d8b696d8e319c9" Mar 10 15:46:54 crc kubenswrapper[4795]: I0310 15:46:54.477889 4795 scope.go:117] "RemoveContainer" 
containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:46:54 crc kubenswrapper[4795]: E0310 15:46:54.479258 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:47:05 crc kubenswrapper[4795]: I0310 15:47:05.476238 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:47:05 crc kubenswrapper[4795]: E0310 15:47:05.477335 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:47:20 crc kubenswrapper[4795]: I0310 15:47:20.476965 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:47:20 crc kubenswrapper[4795]: E0310 15:47:20.477806 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:47:32 crc kubenswrapper[4795]: I0310 15:47:32.476952 4795 scope.go:117] 
"RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:47:32 crc kubenswrapper[4795]: E0310 15:47:32.478037 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:47:47 crc kubenswrapper[4795]: I0310 15:47:47.483768 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:47:47 crc kubenswrapper[4795]: E0310 15:47:47.484410 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.148282 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552628-z9q5g"] Mar 10 15:48:00 crc kubenswrapper[4795]: E0310 15:48:00.150926 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f87be56-5142-4c58-a84a-10d641bd37a9" containerName="oc" Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.150949 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f87be56-5142-4c58-a84a-10d641bd37a9" containerName="oc" Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.151164 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f87be56-5142-4c58-a84a-10d641bd37a9" containerName="oc" Mar 
10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.151908 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552628-z9q5g" Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.159296 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.159326 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.159326 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.159894 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552628-z9q5g"] Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.247738 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5nbw\" (UniqueName: \"kubernetes.io/projected/622392be-1c48-4efe-922c-d816093d9798-kube-api-access-l5nbw\") pod \"auto-csr-approver-29552628-z9q5g\" (UID: \"622392be-1c48-4efe-922c-d816093d9798\") " pod="openshift-infra/auto-csr-approver-29552628-z9q5g" Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.352241 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5nbw\" (UniqueName: \"kubernetes.io/projected/622392be-1c48-4efe-922c-d816093d9798-kube-api-access-l5nbw\") pod \"auto-csr-approver-29552628-z9q5g\" (UID: \"622392be-1c48-4efe-922c-d816093d9798\") " pod="openshift-infra/auto-csr-approver-29552628-z9q5g" Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.380711 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5nbw\" (UniqueName: 
\"kubernetes.io/projected/622392be-1c48-4efe-922c-d816093d9798-kube-api-access-l5nbw\") pod \"auto-csr-approver-29552628-z9q5g\" (UID: \"622392be-1c48-4efe-922c-d816093d9798\") " pod="openshift-infra/auto-csr-approver-29552628-z9q5g" Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.469773 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552628-z9q5g" Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.974825 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552628-z9q5g"] Mar 10 15:48:00 crc kubenswrapper[4795]: I0310 15:48:00.982235 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:48:01 crc kubenswrapper[4795]: I0310 15:48:01.476654 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:48:01 crc kubenswrapper[4795]: E0310 15:48:01.477126 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:48:01 crc kubenswrapper[4795]: I0310 15:48:01.849487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552628-z9q5g" event={"ID":"622392be-1c48-4efe-922c-d816093d9798","Type":"ContainerStarted","Data":"1e7a5cfc3f90edd60b8a44be6abea2d62728e15c42e8bba6fde99022d72424a5"} Mar 10 15:48:02 crc kubenswrapper[4795]: I0310 15:48:02.861926 4795 generic.go:334] "Generic (PLEG): container finished" podID="622392be-1c48-4efe-922c-d816093d9798" 
containerID="0945381c8190a3b6b80b5a52f8c0770a68bbf2b83241137520aacb4f333832da" exitCode=0 Mar 10 15:48:02 crc kubenswrapper[4795]: I0310 15:48:02.861980 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552628-z9q5g" event={"ID":"622392be-1c48-4efe-922c-d816093d9798","Type":"ContainerDied","Data":"0945381c8190a3b6b80b5a52f8c0770a68bbf2b83241137520aacb4f333832da"} Mar 10 15:48:03 crc kubenswrapper[4795]: I0310 15:48:03.873655 4795 generic.go:334] "Generic (PLEG): container finished" podID="c8b65855-20a1-4f05-9d95-89cc7a05baaa" containerID="79a4b518b449a11e7a47947c4867cb563f2c46333fad220382c4647eb11e6921" exitCode=0 Mar 10 15:48:03 crc kubenswrapper[4795]: I0310 15:48:03.873721 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" event={"ID":"c8b65855-20a1-4f05-9d95-89cc7a05baaa","Type":"ContainerDied","Data":"79a4b518b449a11e7a47947c4867cb563f2c46333fad220382c4647eb11e6921"} Mar 10 15:48:04 crc kubenswrapper[4795]: I0310 15:48:04.202688 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552628-z9q5g" Mar 10 15:48:04 crc kubenswrapper[4795]: I0310 15:48:04.337424 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5nbw\" (UniqueName: \"kubernetes.io/projected/622392be-1c48-4efe-922c-d816093d9798-kube-api-access-l5nbw\") pod \"622392be-1c48-4efe-922c-d816093d9798\" (UID: \"622392be-1c48-4efe-922c-d816093d9798\") " Mar 10 15:48:04 crc kubenswrapper[4795]: I0310 15:48:04.343108 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/622392be-1c48-4efe-922c-d816093d9798-kube-api-access-l5nbw" (OuterVolumeSpecName: "kube-api-access-l5nbw") pod "622392be-1c48-4efe-922c-d816093d9798" (UID: "622392be-1c48-4efe-922c-d816093d9798"). InnerVolumeSpecName "kube-api-access-l5nbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:48:04 crc kubenswrapper[4795]: I0310 15:48:04.439561 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5nbw\" (UniqueName: \"kubernetes.io/projected/622392be-1c48-4efe-922c-d816093d9798-kube-api-access-l5nbw\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:04 crc kubenswrapper[4795]: I0310 15:48:04.887293 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552628-z9q5g" event={"ID":"622392be-1c48-4efe-922c-d816093d9798","Type":"ContainerDied","Data":"1e7a5cfc3f90edd60b8a44be6abea2d62728e15c42e8bba6fde99022d72424a5"} Mar 10 15:48:04 crc kubenswrapper[4795]: I0310 15:48:04.887374 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e7a5cfc3f90edd60b8a44be6abea2d62728e15c42e8bba6fde99022d72424a5" Mar 10 15:48:04 crc kubenswrapper[4795]: I0310 15:48:04.887405 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552628-z9q5g" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.274164 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552622-mvzps"] Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.283474 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552622-mvzps"] Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.409187 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.469827 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-ssh-key-openstack-edpm-ipam\") pod \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.470554 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-2\") pod \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.470711 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-1\") pod \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.470794 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-0\") pod \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.470910 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-extra-config-0\") pod \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " Mar 10 15:48:05 crc kubenswrapper[4795]: 
I0310 15:48:05.471097 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhdzd\" (UniqueName: \"kubernetes.io/projected/c8b65855-20a1-4f05-9d95-89cc7a05baaa-kube-api-access-rhdzd\") pod \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.471199 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-1\") pod \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.471341 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-3\") pod \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.472435 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-0\") pod \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.472584 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-combined-ca-bundle\") pod \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.472699 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-inventory\") pod \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\" (UID: \"c8b65855-20a1-4f05-9d95-89cc7a05baaa\") " Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.482500 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8b65855-20a1-4f05-9d95-89cc7a05baaa-kube-api-access-rhdzd" (OuterVolumeSpecName: "kube-api-access-rhdzd") pod "c8b65855-20a1-4f05-9d95-89cc7a05baaa" (UID: "c8b65855-20a1-4f05-9d95-89cc7a05baaa"). InnerVolumeSpecName "kube-api-access-rhdzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.493845 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c8b65855-20a1-4f05-9d95-89cc7a05baaa" (UID: "c8b65855-20a1-4f05-9d95-89cc7a05baaa"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.494437 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93a6c803-148d-4abd-bca2-dab66a3a155e" path="/var/lib/kubelet/pods/93a6c803-148d-4abd-bca2-dab66a3a155e/volumes" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.506184 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "c8b65855-20a1-4f05-9d95-89cc7a05baaa" (UID: "c8b65855-20a1-4f05-9d95-89cc7a05baaa"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.507280 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "c8b65855-20a1-4f05-9d95-89cc7a05baaa" (UID: "c8b65855-20a1-4f05-9d95-89cc7a05baaa"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.510898 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "c8b65855-20a1-4f05-9d95-89cc7a05baaa" (UID: "c8b65855-20a1-4f05-9d95-89cc7a05baaa"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.523287 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "c8b65855-20a1-4f05-9d95-89cc7a05baaa" (UID: "c8b65855-20a1-4f05-9d95-89cc7a05baaa"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.524414 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "c8b65855-20a1-4f05-9d95-89cc7a05baaa" (UID: "c8b65855-20a1-4f05-9d95-89cc7a05baaa"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.525361 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "c8b65855-20a1-4f05-9d95-89cc7a05baaa" (UID: "c8b65855-20a1-4f05-9d95-89cc7a05baaa"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.528368 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-inventory" (OuterVolumeSpecName: "inventory") pod "c8b65855-20a1-4f05-9d95-89cc7a05baaa" (UID: "c8b65855-20a1-4f05-9d95-89cc7a05baaa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.528957 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c8b65855-20a1-4f05-9d95-89cc7a05baaa" (UID: "c8b65855-20a1-4f05-9d95-89cc7a05baaa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.540129 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "c8b65855-20a1-4f05-9d95-89cc7a05baaa" (UID: "c8b65855-20a1-4f05-9d95-89cc7a05baaa"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.575519 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.575561 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.575574 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.575586 4795 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.575596 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhdzd\" (UniqueName: \"kubernetes.io/projected/c8b65855-20a1-4f05-9d95-89cc7a05baaa-kube-api-access-rhdzd\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.575606 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.575619 4795 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.575634 4795 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.575646 4795 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.575661 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.575672 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c8b65855-20a1-4f05-9d95-89cc7a05baaa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.902196 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" event={"ID":"c8b65855-20a1-4f05-9d95-89cc7a05baaa","Type":"ContainerDied","Data":"831944549a7c8a241539a62f53131526005447974bb886a14bab70a140f3152a"} Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.902239 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="831944549a7c8a241539a62f53131526005447974bb886a14bab70a140f3152a" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.902271 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-snhgt" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.998858 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl"] Mar 10 15:48:05 crc kubenswrapper[4795]: E0310 15:48:05.999623 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8b65855-20a1-4f05-9d95-89cc7a05baaa" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.999649 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8b65855-20a1-4f05-9d95-89cc7a05baaa" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 10 15:48:05 crc kubenswrapper[4795]: E0310 15:48:05.999688 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="622392be-1c48-4efe-922c-d816093d9798" containerName="oc" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.999698 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="622392be-1c48-4efe-922c-d816093d9798" containerName="oc" Mar 10 15:48:05 crc kubenswrapper[4795]: I0310 15:48:05.999926 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8b65855-20a1-4f05-9d95-89cc7a05baaa" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:05.999952 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="622392be-1c48-4efe-922c-d816093d9798" containerName="oc" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.000784 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.005353 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-mxwzj" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.008329 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.008384 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.008431 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.008468 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.013286 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl"] Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.084908 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.084977 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: 
\"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.085040 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.085129 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.085266 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzp8l\" (UniqueName: \"kubernetes.io/projected/91b3b9ad-6b02-4647-b3b6-1789e0237c73-kube-api-access-pzp8l\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.085338 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.085486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.187277 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.187336 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.187368 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.187414 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.187442 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzp8l\" (UniqueName: \"kubernetes.io/projected/91b3b9ad-6b02-4647-b3b6-1789e0237c73-kube-api-access-pzp8l\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.187466 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.187507 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.192250 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.192270 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.192253 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.192253 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.193610 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.193902 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.206000 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzp8l\" (UniqueName: \"kubernetes.io/projected/91b3b9ad-6b02-4647-b3b6-1789e0237c73-kube-api-access-pzp8l\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.316227 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.843792 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl"] Mar 10 15:48:06 crc kubenswrapper[4795]: I0310 15:48:06.911420 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" event={"ID":"91b3b9ad-6b02-4647-b3b6-1789e0237c73","Type":"ContainerStarted","Data":"69260ef5a10e3803cecf672a48eb87aabdfdd49cc6ed72a8502b122a955942e8"} Mar 10 15:48:07 crc kubenswrapper[4795]: I0310 15:48:07.950547 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" event={"ID":"91b3b9ad-6b02-4647-b3b6-1789e0237c73","Type":"ContainerStarted","Data":"6b1cb005e81b20788927cb5193b737be0006f93feab9f3cf32f7e690b0b8f40d"} Mar 10 15:48:07 crc kubenswrapper[4795]: I0310 15:48:07.975183 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" podStartSLOduration=2.458004309 podStartE2EDuration="2.975159573s" podCreationTimestamp="2026-03-10 15:48:05 +0000 UTC" firstStartedPulling="2026-03-10 15:48:06.847562415 +0000 UTC m=+2520.013303313" lastFinishedPulling="2026-03-10 15:48:07.364717679 +0000 UTC m=+2520.530458577" observedRunningTime="2026-03-10 15:48:07.971719755 +0000 UTC m=+2521.137460663" watchObservedRunningTime="2026-03-10 15:48:07.975159573 +0000 UTC m=+2521.140900471" Mar 10 15:48:15 crc kubenswrapper[4795]: I0310 15:48:15.477612 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:48:15 crc kubenswrapper[4795]: E0310 15:48:15.478101 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:48:28 crc kubenswrapper[4795]: I0310 15:48:28.479120 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:48:28 crc kubenswrapper[4795]: E0310 15:48:28.479926 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:48:40 crc kubenswrapper[4795]: I0310 15:48:40.477575 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:48:40 crc kubenswrapper[4795]: E0310 15:48:40.479951 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:48:45 crc kubenswrapper[4795]: I0310 15:48:45.916985 4795 scope.go:117] "RemoveContainer" containerID="ecc897a71e7c1e253a2dfa4e270f3a556d93f3fccfebda126b74442882d9ceba" Mar 10 15:48:52 crc kubenswrapper[4795]: I0310 15:48:52.475877 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 
15:48:52 crc kubenswrapper[4795]: E0310 15:48:52.476801 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:49:07 crc kubenswrapper[4795]: I0310 15:49:07.487765 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:49:07 crc kubenswrapper[4795]: E0310 15:49:07.488558 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:49:18 crc kubenswrapper[4795]: I0310 15:49:18.476872 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:49:18 crc kubenswrapper[4795]: E0310 15:49:18.477584 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:49:32 crc kubenswrapper[4795]: I0310 15:49:32.476554 4795 scope.go:117] "RemoveContainer" 
containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:49:32 crc kubenswrapper[4795]: E0310 15:49:32.477421 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:49:45 crc kubenswrapper[4795]: I0310 15:49:45.477646 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:49:45 crc kubenswrapper[4795]: E0310 15:49:45.478581 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 15:49:58 crc kubenswrapper[4795]: I0310 15:49:58.477264 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:49:59 crc kubenswrapper[4795]: I0310 15:49:59.032598 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"ab3cd416f082ee58c9a3add55369408d322f82e82f9edf3db6a8eb0e02bae282"} Mar 10 15:50:00 crc kubenswrapper[4795]: I0310 15:50:00.150022 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552630-zxksk"] Mar 10 15:50:00 crc kubenswrapper[4795]: I0310 15:50:00.151617 4795 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-zxksk" Mar 10 15:50:00 crc kubenswrapper[4795]: I0310 15:50:00.153913 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:50:00 crc kubenswrapper[4795]: I0310 15:50:00.158996 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:50:00 crc kubenswrapper[4795]: I0310 15:50:00.160159 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:50:00 crc kubenswrapper[4795]: I0310 15:50:00.169490 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-zxksk"] Mar 10 15:50:00 crc kubenswrapper[4795]: I0310 15:50:00.197389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nf9s\" (UniqueName: \"kubernetes.io/projected/51b4df05-e41d-4693-bb6c-eee112f59a9c-kube-api-access-6nf9s\") pod \"auto-csr-approver-29552630-zxksk\" (UID: \"51b4df05-e41d-4693-bb6c-eee112f59a9c\") " pod="openshift-infra/auto-csr-approver-29552630-zxksk" Mar 10 15:50:00 crc kubenswrapper[4795]: I0310 15:50:00.299249 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nf9s\" (UniqueName: \"kubernetes.io/projected/51b4df05-e41d-4693-bb6c-eee112f59a9c-kube-api-access-6nf9s\") pod \"auto-csr-approver-29552630-zxksk\" (UID: \"51b4df05-e41d-4693-bb6c-eee112f59a9c\") " pod="openshift-infra/auto-csr-approver-29552630-zxksk" Mar 10 15:50:00 crc kubenswrapper[4795]: I0310 15:50:00.316447 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nf9s\" (UniqueName: \"kubernetes.io/projected/51b4df05-e41d-4693-bb6c-eee112f59a9c-kube-api-access-6nf9s\") pod \"auto-csr-approver-29552630-zxksk\" (UID: 
\"51b4df05-e41d-4693-bb6c-eee112f59a9c\") " pod="openshift-infra/auto-csr-approver-29552630-zxksk" Mar 10 15:50:00 crc kubenswrapper[4795]: I0310 15:50:00.471191 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-zxksk" Mar 10 15:50:01 crc kubenswrapper[4795]: W0310 15:50:00.923348 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51b4df05_e41d_4693_bb6c_eee112f59a9c.slice/crio-87154918816ed4ce57d63ff8257a48d87528f0c42535d83271e00ac8260ce5c2 WatchSource:0}: Error finding container 87154918816ed4ce57d63ff8257a48d87528f0c42535d83271e00ac8260ce5c2: Status 404 returned error can't find the container with id 87154918816ed4ce57d63ff8257a48d87528f0c42535d83271e00ac8260ce5c2 Mar 10 15:50:01 crc kubenswrapper[4795]: I0310 15:50:00.936244 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-zxksk"] Mar 10 15:50:01 crc kubenswrapper[4795]: I0310 15:50:01.050802 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552630-zxksk" event={"ID":"51b4df05-e41d-4693-bb6c-eee112f59a9c","Type":"ContainerStarted","Data":"87154918816ed4ce57d63ff8257a48d87528f0c42535d83271e00ac8260ce5c2"} Mar 10 15:50:03 crc kubenswrapper[4795]: I0310 15:50:03.073164 4795 generic.go:334] "Generic (PLEG): container finished" podID="51b4df05-e41d-4693-bb6c-eee112f59a9c" containerID="3372817b4fff5806bd02648d21494a95d71bc765e20d1a9227f23e7084b8b696" exitCode=0 Mar 10 15:50:03 crc kubenswrapper[4795]: I0310 15:50:03.073224 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552630-zxksk" event={"ID":"51b4df05-e41d-4693-bb6c-eee112f59a9c","Type":"ContainerDied","Data":"3372817b4fff5806bd02648d21494a95d71bc765e20d1a9227f23e7084b8b696"} Mar 10 15:50:04 crc kubenswrapper[4795]: I0310 15:50:04.397432 4795 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-zxksk" Mar 10 15:50:04 crc kubenswrapper[4795]: I0310 15:50:04.481737 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nf9s\" (UniqueName: \"kubernetes.io/projected/51b4df05-e41d-4693-bb6c-eee112f59a9c-kube-api-access-6nf9s\") pod \"51b4df05-e41d-4693-bb6c-eee112f59a9c\" (UID: \"51b4df05-e41d-4693-bb6c-eee112f59a9c\") " Mar 10 15:50:04 crc kubenswrapper[4795]: I0310 15:50:04.487631 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b4df05-e41d-4693-bb6c-eee112f59a9c-kube-api-access-6nf9s" (OuterVolumeSpecName: "kube-api-access-6nf9s") pod "51b4df05-e41d-4693-bb6c-eee112f59a9c" (UID: "51b4df05-e41d-4693-bb6c-eee112f59a9c"). InnerVolumeSpecName "kube-api-access-6nf9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:50:04 crc kubenswrapper[4795]: I0310 15:50:04.584118 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nf9s\" (UniqueName: \"kubernetes.io/projected/51b4df05-e41d-4693-bb6c-eee112f59a9c-kube-api-access-6nf9s\") on node \"crc\" DevicePath \"\"" Mar 10 15:50:05 crc kubenswrapper[4795]: I0310 15:50:05.089093 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552630-zxksk" event={"ID":"51b4df05-e41d-4693-bb6c-eee112f59a9c","Type":"ContainerDied","Data":"87154918816ed4ce57d63ff8257a48d87528f0c42535d83271e00ac8260ce5c2"} Mar 10 15:50:05 crc kubenswrapper[4795]: I0310 15:50:05.089137 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87154918816ed4ce57d63ff8257a48d87528f0c42535d83271e00ac8260ce5c2" Mar 10 15:50:05 crc kubenswrapper[4795]: I0310 15:50:05.089177 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552630-zxksk" Mar 10 15:50:05 crc kubenswrapper[4795]: I0310 15:50:05.463484 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552624-qdjg5"] Mar 10 15:50:05 crc kubenswrapper[4795]: I0310 15:50:05.471620 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552624-qdjg5"] Mar 10 15:50:05 crc kubenswrapper[4795]: I0310 15:50:05.498251 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6931d53a-1555-45b7-9191-978aaf51871f" path="/var/lib/kubelet/pods/6931d53a-1555-45b7-9191-978aaf51871f/volumes" Mar 10 15:50:23 crc kubenswrapper[4795]: I0310 15:50:23.249789 4795 generic.go:334] "Generic (PLEG): container finished" podID="91b3b9ad-6b02-4647-b3b6-1789e0237c73" containerID="6b1cb005e81b20788927cb5193b737be0006f93feab9f3cf32f7e690b0b8f40d" exitCode=0 Mar 10 15:50:23 crc kubenswrapper[4795]: I0310 15:50:23.249922 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" event={"ID":"91b3b9ad-6b02-4647-b3b6-1789e0237c73","Type":"ContainerDied","Data":"6b1cb005e81b20788927cb5193b737be0006f93feab9f3cf32f7e690b0b8f40d"} Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.642195 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.659542 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-1\") pod \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.659639 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ssh-key-openstack-edpm-ipam\") pod \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.659702 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-inventory\") pod \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.659796 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-telemetry-combined-ca-bundle\") pod \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.659849 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-2\") pod \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " Mar 10 15:50:24 crc 
kubenswrapper[4795]: I0310 15:50:24.659925 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzp8l\" (UniqueName: \"kubernetes.io/projected/91b3b9ad-6b02-4647-b3b6-1789e0237c73-kube-api-access-pzp8l\") pod \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.659975 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-0\") pod \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\" (UID: \"91b3b9ad-6b02-4647-b3b6-1789e0237c73\") " Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.674682 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "91b3b9ad-6b02-4647-b3b6-1789e0237c73" (UID: "91b3b9ad-6b02-4647-b3b6-1789e0237c73"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.675674 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91b3b9ad-6b02-4647-b3b6-1789e0237c73-kube-api-access-pzp8l" (OuterVolumeSpecName: "kube-api-access-pzp8l") pod "91b3b9ad-6b02-4647-b3b6-1789e0237c73" (UID: "91b3b9ad-6b02-4647-b3b6-1789e0237c73"). InnerVolumeSpecName "kube-api-access-pzp8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.696159 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "91b3b9ad-6b02-4647-b3b6-1789e0237c73" (UID: "91b3b9ad-6b02-4647-b3b6-1789e0237c73"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.699777 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-inventory" (OuterVolumeSpecName: "inventory") pod "91b3b9ad-6b02-4647-b3b6-1789e0237c73" (UID: "91b3b9ad-6b02-4647-b3b6-1789e0237c73"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.707360 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "91b3b9ad-6b02-4647-b3b6-1789e0237c73" (UID: "91b3b9ad-6b02-4647-b3b6-1789e0237c73"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.719151 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "91b3b9ad-6b02-4647-b3b6-1789e0237c73" (UID: "91b3b9ad-6b02-4647-b3b6-1789e0237c73"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.726063 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "91b3b9ad-6b02-4647-b3b6-1789e0237c73" (UID: "91b3b9ad-6b02-4647-b3b6-1789e0237c73"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.761330 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.761394 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzp8l\" (UniqueName: \"kubernetes.io/projected/91b3b9ad-6b02-4647-b3b6-1789e0237c73-kube-api-access-pzp8l\") on node \"crc\" DevicePath \"\"" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.761406 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.761418 4795 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.761428 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-ssh-key-openstack-edpm-ipam\") on node 
\"crc\" DevicePath \"\"" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.761438 4795 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-inventory\") on node \"crc\" DevicePath \"\"" Mar 10 15:50:24 crc kubenswrapper[4795]: I0310 15:50:24.761450 4795 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91b3b9ad-6b02-4647-b3b6-1789e0237c73-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 15:50:25 crc kubenswrapper[4795]: I0310 15:50:25.269212 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" event={"ID":"91b3b9ad-6b02-4647-b3b6-1789e0237c73","Type":"ContainerDied","Data":"69260ef5a10e3803cecf672a48eb87aabdfdd49cc6ed72a8502b122a955942e8"} Mar 10 15:50:25 crc kubenswrapper[4795]: I0310 15:50:25.269261 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69260ef5a10e3803cecf672a48eb87aabdfdd49cc6ed72a8502b122a955942e8" Mar 10 15:50:25 crc kubenswrapper[4795]: I0310 15:50:25.269344 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl" Mar 10 15:50:46 crc kubenswrapper[4795]: I0310 15:50:46.043942 4795 scope.go:117] "RemoveContainer" containerID="0286b35b4cf1b65b8a1593c79b84f1086ba4c73235386ef1aca3ee6c6289d210" Mar 10 15:51:06 crc kubenswrapper[4795]: I0310 15:51:06.988915 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 15:51:06 crc kubenswrapper[4795]: E0310 15:51:06.990819 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51b4df05-e41d-4693-bb6c-eee112f59a9c" containerName="oc" Mar 10 15:51:06 crc kubenswrapper[4795]: I0310 15:51:06.990842 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b4df05-e41d-4693-bb6c-eee112f59a9c" containerName="oc" Mar 10 15:51:06 crc kubenswrapper[4795]: E0310 15:51:06.990869 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91b3b9ad-6b02-4647-b3b6-1789e0237c73" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 15:51:06 crc kubenswrapper[4795]: I0310 15:51:06.990879 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="91b3b9ad-6b02-4647-b3b6-1789e0237c73" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 15:51:06 crc kubenswrapper[4795]: I0310 15:51:06.991098 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="91b3b9ad-6b02-4647-b3b6-1789e0237c73" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 10 15:51:06 crc kubenswrapper[4795]: I0310 15:51:06.991121 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="51b4df05-e41d-4693-bb6c-eee112f59a9c" containerName="oc" Mar 10 15:51:06 crc kubenswrapper[4795]: I0310 15:51:06.992123 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 15:51:06 crc kubenswrapper[4795]: I0310 15:51:06.994840 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 10 15:51:06 crc kubenswrapper[4795]: I0310 15:51:06.994987 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-rtrl5" Mar 10 15:51:06 crc kubenswrapper[4795]: I0310 15:51:06.995518 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 10 15:51:06 crc kubenswrapper[4795]: I0310 15:51:06.996799 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.021709 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.069402 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.069575 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.069695 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-config-data\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.171751 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.171874 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.171910 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpl64\" (UniqueName: \"kubernetes.io/projected/865ec795-fb93-4628-bdf5-5451ffbf2c0c-kube-api-access-jpl64\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.172041 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.172188 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.172392 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.172522 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.172617 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.173602 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.173712 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-config-data\") pod 
\"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.176931 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-config-data\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.181738 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.275886 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.276017 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.276056 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpl64\" (UniqueName: \"kubernetes.io/projected/865ec795-fb93-4628-bdf5-5451ffbf2c0c-kube-api-access-jpl64\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " 
pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.276126 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.276198 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.276245 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.276697 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.277052 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc 
kubenswrapper[4795]: I0310 15:51:07.277283 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.285059 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.285299 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.300204 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpl64\" (UniqueName: \"kubernetes.io/projected/865ec795-fb93-4628-bdf5-5451ffbf2c0c-kube-api-access-jpl64\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.315027 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"tempest-tests-tempest\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.319510 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 15:51:07 crc kubenswrapper[4795]: I0310 15:51:07.769217 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 10 15:51:08 crc kubenswrapper[4795]: I0310 15:51:08.657843 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"865ec795-fb93-4628-bdf5-5451ffbf2c0c","Type":"ContainerStarted","Data":"8f81bc7e2b4ea553e06db5d4ce8851bc84e974f7851711438ff9818c0b6c0dfe"} Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.257127 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5bzk2"] Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.260962 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.273173 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5bzk2"] Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.379309 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-utilities\") pod \"redhat-operators-5bzk2\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.379572 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-catalog-content\") pod \"redhat-operators-5bzk2\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.379893 4795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw2vk\" (UniqueName: \"kubernetes.io/projected/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-kube-api-access-lw2vk\") pod \"redhat-operators-5bzk2\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.481755 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw2vk\" (UniqueName: \"kubernetes.io/projected/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-kube-api-access-lw2vk\") pod \"redhat-operators-5bzk2\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.481843 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-utilities\") pod \"redhat-operators-5bzk2\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.481955 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-catalog-content\") pod \"redhat-operators-5bzk2\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.482660 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-utilities\") pod \"redhat-operators-5bzk2\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.482710 4795 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-catalog-content\") pod \"redhat-operators-5bzk2\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.507525 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw2vk\" (UniqueName: \"kubernetes.io/projected/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-kube-api-access-lw2vk\") pod \"redhat-operators-5bzk2\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:12 crc kubenswrapper[4795]: I0310 15:51:12.587806 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:13 crc kubenswrapper[4795]: I0310 15:51:13.621186 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5bzk2"] Mar 10 15:51:13 crc kubenswrapper[4795]: I0310 15:51:13.725606 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzk2" event={"ID":"4a4315d5-bdc1-4281-9f30-d77b9c31e17b","Type":"ContainerStarted","Data":"2e0b53bb371d092cb032309fad039549b2ef4dbee65118e4c0bede69c92d03ee"} Mar 10 15:51:14 crc kubenswrapper[4795]: I0310 15:51:14.735686 4795 generic.go:334] "Generic (PLEG): container finished" podID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerID="1395385568a12756abaf0d3fa041a0b5b7459bfe249797c71ef78243a3205229" exitCode=0 Mar 10 15:51:14 crc kubenswrapper[4795]: I0310 15:51:14.735741 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzk2" event={"ID":"4a4315d5-bdc1-4281-9f30-d77b9c31e17b","Type":"ContainerDied","Data":"1395385568a12756abaf0d3fa041a0b5b7459bfe249797c71ef78243a3205229"} Mar 10 15:51:16 crc kubenswrapper[4795]: I0310 15:51:16.759869 4795 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzk2" event={"ID":"4a4315d5-bdc1-4281-9f30-d77b9c31e17b","Type":"ContainerStarted","Data":"80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82"} Mar 10 15:51:19 crc kubenswrapper[4795]: I0310 15:51:19.813436 4795 generic.go:334] "Generic (PLEG): container finished" podID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerID="80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82" exitCode=0 Mar 10 15:51:19 crc kubenswrapper[4795]: I0310 15:51:19.813512 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzk2" event={"ID":"4a4315d5-bdc1-4281-9f30-d77b9c31e17b","Type":"ContainerDied","Data":"80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82"} Mar 10 15:51:34 crc kubenswrapper[4795]: E0310 15:51:34.473595 4795 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 10 15:51:34 crc kubenswrapper[4795]: E0310 15:51:34.474704 4795 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jpl64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(865ec795-fb93-4628-bdf5-5451ffbf2c0c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 10 15:51:34 crc kubenswrapper[4795]: E0310 15:51:34.475901 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="865ec795-fb93-4628-bdf5-5451ffbf2c0c" Mar 10 15:51:34 crc kubenswrapper[4795]: E0310 15:51:34.975612 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="865ec795-fb93-4628-bdf5-5451ffbf2c0c" Mar 10 15:51:35 crc 
kubenswrapper[4795]: I0310 15:51:35.974214 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzk2" event={"ID":"4a4315d5-bdc1-4281-9f30-d77b9c31e17b","Type":"ContainerStarted","Data":"c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017"} Mar 10 15:51:35 crc kubenswrapper[4795]: I0310 15:51:35.999680 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5bzk2" podStartSLOduration=3.975071686 podStartE2EDuration="23.999657018s" podCreationTimestamp="2026-03-10 15:51:12 +0000 UTC" firstStartedPulling="2026-03-10 15:51:14.737827841 +0000 UTC m=+2707.903568739" lastFinishedPulling="2026-03-10 15:51:34.762413133 +0000 UTC m=+2727.928154071" observedRunningTime="2026-03-10 15:51:35.992098413 +0000 UTC m=+2729.157839311" watchObservedRunningTime="2026-03-10 15:51:35.999657018 +0000 UTC m=+2729.165397916" Mar 10 15:51:42 crc kubenswrapper[4795]: I0310 15:51:42.588260 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:42 crc kubenswrapper[4795]: I0310 15:51:42.588826 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:51:43 crc kubenswrapper[4795]: I0310 15:51:43.631868 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5bzk2" podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerName="registry-server" probeResult="failure" output=< Mar 10 15:51:43 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 10 15:51:43 crc kubenswrapper[4795]: > Mar 10 15:51:50 crc kubenswrapper[4795]: I0310 15:51:50.116586 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 10 15:51:52 crc kubenswrapper[4795]: I0310 15:51:52.123394 4795 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"865ec795-fb93-4628-bdf5-5451ffbf2c0c","Type":"ContainerStarted","Data":"a8ffb6484fa57ec3651bec64ae7bbcbd3811d920b206e8f56690a656761f884a"} Mar 10 15:51:53 crc kubenswrapper[4795]: I0310 15:51:53.656809 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5bzk2" podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerName="registry-server" probeResult="failure" output=< Mar 10 15:51:53 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 10 15:51:53 crc kubenswrapper[4795]: > Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.135979 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=12.797718464999999 podStartE2EDuration="55.135958643s" podCreationTimestamp="2026-03-10 15:51:05 +0000 UTC" firstStartedPulling="2026-03-10 15:51:07.774179181 +0000 UTC m=+2700.939920079" lastFinishedPulling="2026-03-10 15:51:50.112419329 +0000 UTC m=+2743.278160257" observedRunningTime="2026-03-10 15:51:52.146475138 +0000 UTC m=+2745.312216076" watchObservedRunningTime="2026-03-10 15:52:00.135958643 +0000 UTC m=+2753.301699541" Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.147892 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552632-xl7lb"] Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.149236 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-xl7lb" Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.151995 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.152166 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.152268 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.159191 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-xl7lb"] Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.175516 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6q7t\" (UniqueName: \"kubernetes.io/projected/39dce396-78f7-40b5-b4db-abe0bbc524c7-kube-api-access-k6q7t\") pod \"auto-csr-approver-29552632-xl7lb\" (UID: \"39dce396-78f7-40b5-b4db-abe0bbc524c7\") " pod="openshift-infra/auto-csr-approver-29552632-xl7lb" Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.277162 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6q7t\" (UniqueName: \"kubernetes.io/projected/39dce396-78f7-40b5-b4db-abe0bbc524c7-kube-api-access-k6q7t\") pod \"auto-csr-approver-29552632-xl7lb\" (UID: \"39dce396-78f7-40b5-b4db-abe0bbc524c7\") " pod="openshift-infra/auto-csr-approver-29552632-xl7lb" Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.298194 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6q7t\" (UniqueName: \"kubernetes.io/projected/39dce396-78f7-40b5-b4db-abe0bbc524c7-kube-api-access-k6q7t\") pod \"auto-csr-approver-29552632-xl7lb\" (UID: \"39dce396-78f7-40b5-b4db-abe0bbc524c7\") " 
pod="openshift-infra/auto-csr-approver-29552632-xl7lb" Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.474144 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-xl7lb" Mar 10 15:52:00 crc kubenswrapper[4795]: I0310 15:52:00.932110 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-xl7lb"] Mar 10 15:52:01 crc kubenswrapper[4795]: I0310 15:52:01.204590 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552632-xl7lb" event={"ID":"39dce396-78f7-40b5-b4db-abe0bbc524c7","Type":"ContainerStarted","Data":"03cb8ad845012a1c08235329d49bc8876201a5baa3756a6829224a2824b5afbf"} Mar 10 15:52:03 crc kubenswrapper[4795]: I0310 15:52:03.224345 4795 generic.go:334] "Generic (PLEG): container finished" podID="39dce396-78f7-40b5-b4db-abe0bbc524c7" containerID="0fac467e277e0fe3b940d02fa772a8c4630a4d90d114cf740f7372ab4914b0ca" exitCode=0 Mar 10 15:52:03 crc kubenswrapper[4795]: I0310 15:52:03.224410 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552632-xl7lb" event={"ID":"39dce396-78f7-40b5-b4db-abe0bbc524c7","Type":"ContainerDied","Data":"0fac467e277e0fe3b940d02fa772a8c4630a4d90d114cf740f7372ab4914b0ca"} Mar 10 15:52:03 crc kubenswrapper[4795]: I0310 15:52:03.635743 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5bzk2" podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerName="registry-server" probeResult="failure" output=< Mar 10 15:52:03 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 10 15:52:03 crc kubenswrapper[4795]: > Mar 10 15:52:04 crc kubenswrapper[4795]: I0310 15:52:04.593426 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-xl7lb" Mar 10 15:52:04 crc kubenswrapper[4795]: I0310 15:52:04.661787 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6q7t\" (UniqueName: \"kubernetes.io/projected/39dce396-78f7-40b5-b4db-abe0bbc524c7-kube-api-access-k6q7t\") pod \"39dce396-78f7-40b5-b4db-abe0bbc524c7\" (UID: \"39dce396-78f7-40b5-b4db-abe0bbc524c7\") " Mar 10 15:52:04 crc kubenswrapper[4795]: I0310 15:52:04.667822 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39dce396-78f7-40b5-b4db-abe0bbc524c7-kube-api-access-k6q7t" (OuterVolumeSpecName: "kube-api-access-k6q7t") pod "39dce396-78f7-40b5-b4db-abe0bbc524c7" (UID: "39dce396-78f7-40b5-b4db-abe0bbc524c7"). InnerVolumeSpecName "kube-api-access-k6q7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:52:04 crc kubenswrapper[4795]: I0310 15:52:04.765018 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6q7t\" (UniqueName: \"kubernetes.io/projected/39dce396-78f7-40b5-b4db-abe0bbc524c7-kube-api-access-k6q7t\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:05 crc kubenswrapper[4795]: I0310 15:52:05.243772 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552632-xl7lb" event={"ID":"39dce396-78f7-40b5-b4db-abe0bbc524c7","Type":"ContainerDied","Data":"03cb8ad845012a1c08235329d49bc8876201a5baa3756a6829224a2824b5afbf"} Mar 10 15:52:05 crc kubenswrapper[4795]: I0310 15:52:05.244067 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03cb8ad845012a1c08235329d49bc8876201a5baa3756a6829224a2824b5afbf" Mar 10 15:52:05 crc kubenswrapper[4795]: I0310 15:52:05.243825 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552632-xl7lb" Mar 10 15:52:05 crc kubenswrapper[4795]: I0310 15:52:05.683873 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552626-ql86v"] Mar 10 15:52:05 crc kubenswrapper[4795]: I0310 15:52:05.697001 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552626-ql86v"] Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.644601 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-g24k9"] Mar 10 15:52:06 crc kubenswrapper[4795]: E0310 15:52:06.645139 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39dce396-78f7-40b5-b4db-abe0bbc524c7" containerName="oc" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.645161 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="39dce396-78f7-40b5-b4db-abe0bbc524c7" containerName="oc" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.645365 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="39dce396-78f7-40b5-b4db-abe0bbc524c7" containerName="oc" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.646733 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.676728 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g24k9"] Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.811169 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-catalog-content\") pod \"redhat-marketplace-g24k9\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.811464 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfdl7\" (UniqueName: \"kubernetes.io/projected/08f29d07-f89c-4692-a2db-9499d9bd89a2-kube-api-access-zfdl7\") pod \"redhat-marketplace-g24k9\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.811532 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-utilities\") pod \"redhat-marketplace-g24k9\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.913683 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfdl7\" (UniqueName: \"kubernetes.io/projected/08f29d07-f89c-4692-a2db-9499d9bd89a2-kube-api-access-zfdl7\") pod \"redhat-marketplace-g24k9\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.913749 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-utilities\") pod \"redhat-marketplace-g24k9\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.913900 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-catalog-content\") pod \"redhat-marketplace-g24k9\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.914540 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-catalog-content\") pod \"redhat-marketplace-g24k9\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.914749 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-utilities\") pod \"redhat-marketplace-g24k9\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.940109 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfdl7\" (UniqueName: \"kubernetes.io/projected/08f29d07-f89c-4692-a2db-9499d9bd89a2-kube-api-access-zfdl7\") pod \"redhat-marketplace-g24k9\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:06 crc kubenswrapper[4795]: I0310 15:52:06.980840 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:07 crc kubenswrapper[4795]: I0310 15:52:07.493279 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f87be56-5142-4c58-a84a-10d641bd37a9" path="/var/lib/kubelet/pods/5f87be56-5142-4c58-a84a-10d641bd37a9/volumes" Mar 10 15:52:07 crc kubenswrapper[4795]: I0310 15:52:07.501253 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-g24k9"] Mar 10 15:52:08 crc kubenswrapper[4795]: I0310 15:52:08.281021 4795 generic.go:334] "Generic (PLEG): container finished" podID="08f29d07-f89c-4692-a2db-9499d9bd89a2" containerID="f235640efe6762fa356ca0904f3a00c62dceb16dcc0e7741aabdccc638cd1348" exitCode=0 Mar 10 15:52:08 crc kubenswrapper[4795]: I0310 15:52:08.281106 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g24k9" event={"ID":"08f29d07-f89c-4692-a2db-9499d9bd89a2","Type":"ContainerDied","Data":"f235640efe6762fa356ca0904f3a00c62dceb16dcc0e7741aabdccc638cd1348"} Mar 10 15:52:08 crc kubenswrapper[4795]: I0310 15:52:08.281327 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g24k9" event={"ID":"08f29d07-f89c-4692-a2db-9499d9bd89a2","Type":"ContainerStarted","Data":"b6b86507025161d65c6a338cb95b523a554fc2e6a4442ed1b907e6786e17948e"} Mar 10 15:52:09 crc kubenswrapper[4795]: I0310 15:52:09.294591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g24k9" event={"ID":"08f29d07-f89c-4692-a2db-9499d9bd89a2","Type":"ContainerStarted","Data":"8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb"} Mar 10 15:52:10 crc kubenswrapper[4795]: I0310 15:52:10.310266 4795 generic.go:334] "Generic (PLEG): container finished" podID="08f29d07-f89c-4692-a2db-9499d9bd89a2" containerID="8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb" exitCode=0 Mar 10 15:52:10 crc 
kubenswrapper[4795]: I0310 15:52:10.310347 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g24k9" event={"ID":"08f29d07-f89c-4692-a2db-9499d9bd89a2","Type":"ContainerDied","Data":"8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb"} Mar 10 15:52:11 crc kubenswrapper[4795]: I0310 15:52:11.324639 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g24k9" event={"ID":"08f29d07-f89c-4692-a2db-9499d9bd89a2","Type":"ContainerStarted","Data":"c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e"} Mar 10 15:52:11 crc kubenswrapper[4795]: I0310 15:52:11.354753 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-g24k9" podStartSLOduration=2.8012439369999997 podStartE2EDuration="5.354731423s" podCreationTimestamp="2026-03-10 15:52:06 +0000 UTC" firstStartedPulling="2026-03-10 15:52:08.283781854 +0000 UTC m=+2761.449522742" lastFinishedPulling="2026-03-10 15:52:10.83726933 +0000 UTC m=+2764.003010228" observedRunningTime="2026-03-10 15:52:11.349662018 +0000 UTC m=+2764.515402916" watchObservedRunningTime="2026-03-10 15:52:11.354731423 +0000 UTC m=+2764.520472321" Mar 10 15:52:12 crc kubenswrapper[4795]: I0310 15:52:12.641441 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:52:12 crc kubenswrapper[4795]: I0310 15:52:12.696101 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:52:14 crc kubenswrapper[4795]: I0310 15:52:14.015477 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5bzk2"] Mar 10 15:52:14 crc kubenswrapper[4795]: I0310 15:52:14.354009 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5bzk2" 
podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerName="registry-server" containerID="cri-o://c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017" gracePeriod=2 Mar 10 15:52:14 crc kubenswrapper[4795]: I0310 15:52:14.803538 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:52:14 crc kubenswrapper[4795]: I0310 15:52:14.868917 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw2vk\" (UniqueName: \"kubernetes.io/projected/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-kube-api-access-lw2vk\") pod \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " Mar 10 15:52:14 crc kubenswrapper[4795]: I0310 15:52:14.869174 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-utilities\") pod \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " Mar 10 15:52:14 crc kubenswrapper[4795]: I0310 15:52:14.869199 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-catalog-content\") pod \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\" (UID: \"4a4315d5-bdc1-4281-9f30-d77b9c31e17b\") " Mar 10 15:52:14 crc kubenswrapper[4795]: I0310 15:52:14.870021 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-utilities" (OuterVolumeSpecName: "utilities") pod "4a4315d5-bdc1-4281-9f30-d77b9c31e17b" (UID: "4a4315d5-bdc1-4281-9f30-d77b9c31e17b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:52:14 crc kubenswrapper[4795]: I0310 15:52:14.874386 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-kube-api-access-lw2vk" (OuterVolumeSpecName: "kube-api-access-lw2vk") pod "4a4315d5-bdc1-4281-9f30-d77b9c31e17b" (UID: "4a4315d5-bdc1-4281-9f30-d77b9c31e17b"). InnerVolumeSpecName "kube-api-access-lw2vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:52:14 crc kubenswrapper[4795]: I0310 15:52:14.971766 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:14 crc kubenswrapper[4795]: I0310 15:52:14.971806 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw2vk\" (UniqueName: \"kubernetes.io/projected/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-kube-api-access-lw2vk\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.011835 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a4315d5-bdc1-4281-9f30-d77b9c31e17b" (UID: "4a4315d5-bdc1-4281-9f30-d77b9c31e17b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.073781 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a4315d5-bdc1-4281-9f30-d77b9c31e17b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.364490 4795 generic.go:334] "Generic (PLEG): container finished" podID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerID="c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017" exitCode=0 Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.364537 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzk2" event={"ID":"4a4315d5-bdc1-4281-9f30-d77b9c31e17b","Type":"ContainerDied","Data":"c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017"} Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.364567 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5bzk2" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.364586 4795 scope.go:117] "RemoveContainer" containerID="c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.364567 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5bzk2" event={"ID":"4a4315d5-bdc1-4281-9f30-d77b9c31e17b","Type":"ContainerDied","Data":"2e0b53bb371d092cb032309fad039549b2ef4dbee65118e4c0bede69c92d03ee"} Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.383344 4795 scope.go:117] "RemoveContainer" containerID="80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.412130 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5bzk2"] Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.421810 4795 scope.go:117] "RemoveContainer" containerID="1395385568a12756abaf0d3fa041a0b5b7459bfe249797c71ef78243a3205229" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.423998 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5bzk2"] Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.450617 4795 scope.go:117] "RemoveContainer" containerID="c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017" Mar 10 15:52:15 crc kubenswrapper[4795]: E0310 15:52:15.451097 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017\": container with ID starting with c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017 not found: ID does not exist" containerID="c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.451146 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017"} err="failed to get container status \"c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017\": rpc error: code = NotFound desc = could not find container \"c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017\": container with ID starting with c055236c632f93b3b63d19593a08a4476f32a7466ce6903e0d0404b474427017 not found: ID does not exist" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.451175 4795 scope.go:117] "RemoveContainer" containerID="80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82" Mar 10 15:52:15 crc kubenswrapper[4795]: E0310 15:52:15.451502 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82\": container with ID starting with 80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82 not found: ID does not exist" containerID="80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.451541 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82"} err="failed to get container status \"80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82\": rpc error: code = NotFound desc = could not find container \"80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82\": container with ID starting with 80b39ec2bf363f6b2c4a533d0158ee816c172d115846230bcfbe903dc2db3b82 not found: ID does not exist" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.451569 4795 scope.go:117] "RemoveContainer" containerID="1395385568a12756abaf0d3fa041a0b5b7459bfe249797c71ef78243a3205229" Mar 10 15:52:15 crc kubenswrapper[4795]: E0310 
15:52:15.451799 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1395385568a12756abaf0d3fa041a0b5b7459bfe249797c71ef78243a3205229\": container with ID starting with 1395385568a12756abaf0d3fa041a0b5b7459bfe249797c71ef78243a3205229 not found: ID does not exist" containerID="1395385568a12756abaf0d3fa041a0b5b7459bfe249797c71ef78243a3205229" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.451829 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1395385568a12756abaf0d3fa041a0b5b7459bfe249797c71ef78243a3205229"} err="failed to get container status \"1395385568a12756abaf0d3fa041a0b5b7459bfe249797c71ef78243a3205229\": rpc error: code = NotFound desc = could not find container \"1395385568a12756abaf0d3fa041a0b5b7459bfe249797c71ef78243a3205229\": container with ID starting with 1395385568a12756abaf0d3fa041a0b5b7459bfe249797c71ef78243a3205229 not found: ID does not exist" Mar 10 15:52:15 crc kubenswrapper[4795]: I0310 15:52:15.497643 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" path="/var/lib/kubelet/pods/4a4315d5-bdc1-4281-9f30-d77b9c31e17b/volumes" Mar 10 15:52:16 crc kubenswrapper[4795]: I0310 15:52:16.981535 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:16 crc kubenswrapper[4795]: I0310 15:52:16.981594 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:17 crc kubenswrapper[4795]: I0310 15:52:17.033780 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:17 crc kubenswrapper[4795]: I0310 15:52:17.426700 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:18 crc kubenswrapper[4795]: I0310 15:52:18.211623 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g24k9"] Mar 10 15:52:18 crc kubenswrapper[4795]: I0310 15:52:18.538762 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:52:18 crc kubenswrapper[4795]: I0310 15:52:18.538840 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:52:19 crc kubenswrapper[4795]: I0310 15:52:19.410254 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-g24k9" podUID="08f29d07-f89c-4692-a2db-9499d9bd89a2" containerName="registry-server" containerID="cri-o://c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e" gracePeriod=2 Mar 10 15:52:19 crc kubenswrapper[4795]: I0310 15:52:19.852468 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:19 crc kubenswrapper[4795]: I0310 15:52:19.959250 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfdl7\" (UniqueName: \"kubernetes.io/projected/08f29d07-f89c-4692-a2db-9499d9bd89a2-kube-api-access-zfdl7\") pod \"08f29d07-f89c-4692-a2db-9499d9bd89a2\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " Mar 10 15:52:19 crc kubenswrapper[4795]: I0310 15:52:19.959568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-utilities\") pod \"08f29d07-f89c-4692-a2db-9499d9bd89a2\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " Mar 10 15:52:19 crc kubenswrapper[4795]: I0310 15:52:19.959686 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-catalog-content\") pod \"08f29d07-f89c-4692-a2db-9499d9bd89a2\" (UID: \"08f29d07-f89c-4692-a2db-9499d9bd89a2\") " Mar 10 15:52:19 crc kubenswrapper[4795]: I0310 15:52:19.960187 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-utilities" (OuterVolumeSpecName: "utilities") pod "08f29d07-f89c-4692-a2db-9499d9bd89a2" (UID: "08f29d07-f89c-4692-a2db-9499d9bd89a2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:52:19 crc kubenswrapper[4795]: I0310 15:52:19.968259 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f29d07-f89c-4692-a2db-9499d9bd89a2-kube-api-access-zfdl7" (OuterVolumeSpecName: "kube-api-access-zfdl7") pod "08f29d07-f89c-4692-a2db-9499d9bd89a2" (UID: "08f29d07-f89c-4692-a2db-9499d9bd89a2"). InnerVolumeSpecName "kube-api-access-zfdl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.056173 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08f29d07-f89c-4692-a2db-9499d9bd89a2" (UID: "08f29d07-f89c-4692-a2db-9499d9bd89a2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.062536 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.062582 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08f29d07-f89c-4692-a2db-9499d9bd89a2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.062598 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfdl7\" (UniqueName: \"kubernetes.io/projected/08f29d07-f89c-4692-a2db-9499d9bd89a2-kube-api-access-zfdl7\") on node \"crc\" DevicePath \"\"" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.423352 4795 generic.go:334] "Generic (PLEG): container finished" podID="08f29d07-f89c-4692-a2db-9499d9bd89a2" containerID="c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e" exitCode=0 Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.423419 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-g24k9" event={"ID":"08f29d07-f89c-4692-a2db-9499d9bd89a2","Type":"ContainerDied","Data":"c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e"} Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.423460 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-g24k9" event={"ID":"08f29d07-f89c-4692-a2db-9499d9bd89a2","Type":"ContainerDied","Data":"b6b86507025161d65c6a338cb95b523a554fc2e6a4442ed1b907e6786e17948e"} Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.423419 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-g24k9" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.423491 4795 scope.go:117] "RemoveContainer" containerID="c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.452225 4795 scope.go:117] "RemoveContainer" containerID="8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.474549 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-g24k9"] Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.483348 4795 scope.go:117] "RemoveContainer" containerID="f235640efe6762fa356ca0904f3a00c62dceb16dcc0e7741aabdccc638cd1348" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.484288 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-g24k9"] Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.540293 4795 scope.go:117] "RemoveContainer" containerID="c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e" Mar 10 15:52:20 crc kubenswrapper[4795]: E0310 15:52:20.540878 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e\": container with ID starting with c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e not found: ID does not exist" containerID="c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.540932 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e"} err="failed to get container status \"c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e\": rpc error: code = NotFound desc = could not find container \"c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e\": container with ID starting with c11254830014a3ce16f53c8fa60e9b056e85c425c48df39bec93cd8a5e91253e not found: ID does not exist" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.540969 4795 scope.go:117] "RemoveContainer" containerID="8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb" Mar 10 15:52:20 crc kubenswrapper[4795]: E0310 15:52:20.541364 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb\": container with ID starting with 8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb not found: ID does not exist" containerID="8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.541392 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb"} err="failed to get container status \"8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb\": rpc error: code = NotFound desc = could not find container \"8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb\": container with ID starting with 8f0e9399f6a15096ffd27d55feb4ec3939dcee64c106e2f76bb04f0c2594b7fb not found: ID does not exist" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.541413 4795 scope.go:117] "RemoveContainer" containerID="f235640efe6762fa356ca0904f3a00c62dceb16dcc0e7741aabdccc638cd1348" Mar 10 15:52:20 crc kubenswrapper[4795]: E0310 
15:52:20.541719 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f235640efe6762fa356ca0904f3a00c62dceb16dcc0e7741aabdccc638cd1348\": container with ID starting with f235640efe6762fa356ca0904f3a00c62dceb16dcc0e7741aabdccc638cd1348 not found: ID does not exist" containerID="f235640efe6762fa356ca0904f3a00c62dceb16dcc0e7741aabdccc638cd1348" Mar 10 15:52:20 crc kubenswrapper[4795]: I0310 15:52:20.541740 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f235640efe6762fa356ca0904f3a00c62dceb16dcc0e7741aabdccc638cd1348"} err="failed to get container status \"f235640efe6762fa356ca0904f3a00c62dceb16dcc0e7741aabdccc638cd1348\": rpc error: code = NotFound desc = could not find container \"f235640efe6762fa356ca0904f3a00c62dceb16dcc0e7741aabdccc638cd1348\": container with ID starting with f235640efe6762fa356ca0904f3a00c62dceb16dcc0e7741aabdccc638cd1348 not found: ID does not exist" Mar 10 15:52:21 crc kubenswrapper[4795]: I0310 15:52:21.491006 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08f29d07-f89c-4692-a2db-9499d9bd89a2" path="/var/lib/kubelet/pods/08f29d07-f89c-4692-a2db-9499d9bd89a2/volumes" Mar 10 15:52:46 crc kubenswrapper[4795]: I0310 15:52:46.161721 4795 scope.go:117] "RemoveContainer" containerID="c9683c32df4e647bac039fd5aed5867ca019063f8d4a28caab16d3f75cc536a5" Mar 10 15:52:48 crc kubenswrapper[4795]: I0310 15:52:48.539494 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:52:48 crc kubenswrapper[4795]: I0310 15:52:48.539748 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" 
podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.354301 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zqrvn"] Mar 10 15:53:04 crc kubenswrapper[4795]: E0310 15:53:04.355869 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f29d07-f89c-4692-a2db-9499d9bd89a2" containerName="extract-content" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.355893 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f29d07-f89c-4692-a2db-9499d9bd89a2" containerName="extract-content" Mar 10 15:53:04 crc kubenswrapper[4795]: E0310 15:53:04.355933 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerName="extract-utilities" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.355942 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerName="extract-utilities" Mar 10 15:53:04 crc kubenswrapper[4795]: E0310 15:53:04.355955 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerName="registry-server" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.355962 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerName="registry-server" Mar 10 15:53:04 crc kubenswrapper[4795]: E0310 15:53:04.355988 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f29d07-f89c-4692-a2db-9499d9bd89a2" containerName="registry-server" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.355996 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f29d07-f89c-4692-a2db-9499d9bd89a2" containerName="registry-server" Mar 10 15:53:04 crc 
kubenswrapper[4795]: E0310 15:53:04.356024 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f29d07-f89c-4692-a2db-9499d9bd89a2" containerName="extract-utilities" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.356034 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f29d07-f89c-4692-a2db-9499d9bd89a2" containerName="extract-utilities" Mar 10 15:53:04 crc kubenswrapper[4795]: E0310 15:53:04.356049 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerName="extract-content" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.356057 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerName="extract-content" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.356343 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f29d07-f89c-4692-a2db-9499d9bd89a2" containerName="registry-server" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.356364 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a4315d5-bdc1-4281-9f30-d77b9c31e17b" containerName="registry-server" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.358374 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.373844 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zqrvn"] Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.461156 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-utilities\") pod \"community-operators-zqrvn\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.461218 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-catalog-content\") pod \"community-operators-zqrvn\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.461294 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4jt6\" (UniqueName: \"kubernetes.io/projected/5f94b2c8-efc6-47d6-8a99-90caef2e6606-kube-api-access-f4jt6\") pod \"community-operators-zqrvn\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.563974 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-utilities\") pod \"community-operators-zqrvn\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.564038 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-catalog-content\") pod \"community-operators-zqrvn\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.564130 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4jt6\" (UniqueName: \"kubernetes.io/projected/5f94b2c8-efc6-47d6-8a99-90caef2e6606-kube-api-access-f4jt6\") pod \"community-operators-zqrvn\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.564726 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-utilities\") pod \"community-operators-zqrvn\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.564848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-catalog-content\") pod \"community-operators-zqrvn\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.589397 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4jt6\" (UniqueName: \"kubernetes.io/projected/5f94b2c8-efc6-47d6-8a99-90caef2e6606-kube-api-access-f4jt6\") pod \"community-operators-zqrvn\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:04 crc kubenswrapper[4795]: I0310 15:53:04.680678 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:05 crc kubenswrapper[4795]: I0310 15:53:05.260143 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zqrvn"] Mar 10 15:53:05 crc kubenswrapper[4795]: I0310 15:53:05.846380 4795 generic.go:334] "Generic (PLEG): container finished" podID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" containerID="5b6d1f81b821cb50281d1f3a8b71ae636c0d7922a7d9ffad6f842cc4265dd4c8" exitCode=0 Mar 10 15:53:05 crc kubenswrapper[4795]: I0310 15:53:05.846428 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqrvn" event={"ID":"5f94b2c8-efc6-47d6-8a99-90caef2e6606","Type":"ContainerDied","Data":"5b6d1f81b821cb50281d1f3a8b71ae636c0d7922a7d9ffad6f842cc4265dd4c8"} Mar 10 15:53:05 crc kubenswrapper[4795]: I0310 15:53:05.846722 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqrvn" event={"ID":"5f94b2c8-efc6-47d6-8a99-90caef2e6606","Type":"ContainerStarted","Data":"417e65e34f25d260fd7b322f38602827ff8437fe04897f7126e9af04e94e6eed"} Mar 10 15:53:05 crc kubenswrapper[4795]: I0310 15:53:05.848888 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 15:53:06 crc kubenswrapper[4795]: I0310 15:53:06.857216 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqrvn" event={"ID":"5f94b2c8-efc6-47d6-8a99-90caef2e6606","Type":"ContainerStarted","Data":"5cf4d0571fa989d6717e5db2a569d55b920405472e3f47e1c4cbe453e4467a78"} Mar 10 15:53:07 crc kubenswrapper[4795]: I0310 15:53:07.865991 4795 generic.go:334] "Generic (PLEG): container finished" podID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" containerID="5cf4d0571fa989d6717e5db2a569d55b920405472e3f47e1c4cbe453e4467a78" exitCode=0 Mar 10 15:53:07 crc kubenswrapper[4795]: I0310 15:53:07.866064 4795 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-zqrvn" event={"ID":"5f94b2c8-efc6-47d6-8a99-90caef2e6606","Type":"ContainerDied","Data":"5cf4d0571fa989d6717e5db2a569d55b920405472e3f47e1c4cbe453e4467a78"} Mar 10 15:53:08 crc kubenswrapper[4795]: I0310 15:53:08.876233 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqrvn" event={"ID":"5f94b2c8-efc6-47d6-8a99-90caef2e6606","Type":"ContainerStarted","Data":"f931964dc542807a61d88da93527d9f4fc103528c9a32a0bd2e5d9a0c9d3edbd"} Mar 10 15:53:08 crc kubenswrapper[4795]: I0310 15:53:08.895415 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zqrvn" podStartSLOduration=2.497039425 podStartE2EDuration="4.895399266s" podCreationTimestamp="2026-03-10 15:53:04 +0000 UTC" firstStartedPulling="2026-03-10 15:53:05.848519175 +0000 UTC m=+2819.014260083" lastFinishedPulling="2026-03-10 15:53:08.246879026 +0000 UTC m=+2821.412619924" observedRunningTime="2026-03-10 15:53:08.893889663 +0000 UTC m=+2822.059630581" watchObservedRunningTime="2026-03-10 15:53:08.895399266 +0000 UTC m=+2822.061140164" Mar 10 15:53:14 crc kubenswrapper[4795]: I0310 15:53:14.681381 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:14 crc kubenswrapper[4795]: I0310 15:53:14.681985 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:14 crc kubenswrapper[4795]: I0310 15:53:14.736083 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:14 crc kubenswrapper[4795]: I0310 15:53:14.959593 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 
15:53:16.125810 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t87rp"] Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.127863 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.152317 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t87rp"] Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.285265 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-catalog-content\") pod \"certified-operators-t87rp\" (UID: \"28d7935c-8eb6-49cc-96fb-882015615f8d\") " pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.285376 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcmpv\" (UniqueName: \"kubernetes.io/projected/28d7935c-8eb6-49cc-96fb-882015615f8d-kube-api-access-zcmpv\") pod \"certified-operators-t87rp\" (UID: \"28d7935c-8eb6-49cc-96fb-882015615f8d\") " pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.285426 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-utilities\") pod \"certified-operators-t87rp\" (UID: \"28d7935c-8eb6-49cc-96fb-882015615f8d\") " pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.387253 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-catalog-content\") pod 
\"certified-operators-t87rp\" (UID: \"28d7935c-8eb6-49cc-96fb-882015615f8d\") " pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.387345 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcmpv\" (UniqueName: \"kubernetes.io/projected/28d7935c-8eb6-49cc-96fb-882015615f8d-kube-api-access-zcmpv\") pod \"certified-operators-t87rp\" (UID: \"28d7935c-8eb6-49cc-96fb-882015615f8d\") " pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.387380 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-utilities\") pod \"certified-operators-t87rp\" (UID: \"28d7935c-8eb6-49cc-96fb-882015615f8d\") " pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.387957 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-utilities\") pod \"certified-operators-t87rp\" (UID: \"28d7935c-8eb6-49cc-96fb-882015615f8d\") " pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.387966 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-catalog-content\") pod \"certified-operators-t87rp\" (UID: \"28d7935c-8eb6-49cc-96fb-882015615f8d\") " pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.409091 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcmpv\" (UniqueName: \"kubernetes.io/projected/28d7935c-8eb6-49cc-96fb-882015615f8d-kube-api-access-zcmpv\") pod \"certified-operators-t87rp\" (UID: 
\"28d7935c-8eb6-49cc-96fb-882015615f8d\") " pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.448894 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.927400 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zqrvn"] Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.933865 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zqrvn" podUID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" containerName="registry-server" containerID="cri-o://f931964dc542807a61d88da93527d9f4fc103528c9a32a0bd2e5d9a0c9d3edbd" gracePeriod=2 Mar 10 15:53:16 crc kubenswrapper[4795]: I0310 15:53:16.979730 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t87rp"] Mar 10 15:53:17 crc kubenswrapper[4795]: I0310 15:53:17.948101 4795 generic.go:334] "Generic (PLEG): container finished" podID="28d7935c-8eb6-49cc-96fb-882015615f8d" containerID="2e0c4be65baf6d66e527aad667c95a872bf22071132527496dca92c2603c5948" exitCode=0 Mar 10 15:53:17 crc kubenswrapper[4795]: I0310 15:53:17.948700 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t87rp" event={"ID":"28d7935c-8eb6-49cc-96fb-882015615f8d","Type":"ContainerDied","Data":"2e0c4be65baf6d66e527aad667c95a872bf22071132527496dca92c2603c5948"} Mar 10 15:53:17 crc kubenswrapper[4795]: I0310 15:53:17.948734 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t87rp" event={"ID":"28d7935c-8eb6-49cc-96fb-882015615f8d","Type":"ContainerStarted","Data":"4bcb1b00268847546a50ca98c5ca6e25fc255d96b65a7a66d8b24e38f33faa34"} Mar 10 15:53:17 crc kubenswrapper[4795]: I0310 15:53:17.956377 4795 generic.go:334] 
"Generic (PLEG): container finished" podID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" containerID="f931964dc542807a61d88da93527d9f4fc103528c9a32a0bd2e5d9a0c9d3edbd" exitCode=0 Mar 10 15:53:17 crc kubenswrapper[4795]: I0310 15:53:17.956415 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqrvn" event={"ID":"5f94b2c8-efc6-47d6-8a99-90caef2e6606","Type":"ContainerDied","Data":"f931964dc542807a61d88da93527d9f4fc103528c9a32a0bd2e5d9a0c9d3edbd"} Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.035343 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.126166 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-utilities\") pod \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.126217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-catalog-content\") pod \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.126252 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4jt6\" (UniqueName: \"kubernetes.io/projected/5f94b2c8-efc6-47d6-8a99-90caef2e6606-kube-api-access-f4jt6\") pod \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\" (UID: \"5f94b2c8-efc6-47d6-8a99-90caef2e6606\") " Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.127870 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-utilities" (OuterVolumeSpecName: 
"utilities") pod "5f94b2c8-efc6-47d6-8a99-90caef2e6606" (UID: "5f94b2c8-efc6-47d6-8a99-90caef2e6606"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.133136 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f94b2c8-efc6-47d6-8a99-90caef2e6606-kube-api-access-f4jt6" (OuterVolumeSpecName: "kube-api-access-f4jt6") pod "5f94b2c8-efc6-47d6-8a99-90caef2e6606" (UID: "5f94b2c8-efc6-47d6-8a99-90caef2e6606"). InnerVolumeSpecName "kube-api-access-f4jt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.182304 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5f94b2c8-efc6-47d6-8a99-90caef2e6606" (UID: "5f94b2c8-efc6-47d6-8a99-90caef2e6606"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.229025 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.229087 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5f94b2c8-efc6-47d6-8a99-90caef2e6606-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.229105 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4jt6\" (UniqueName: \"kubernetes.io/projected/5f94b2c8-efc6-47d6-8a99-90caef2e6606-kube-api-access-f4jt6\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.539326 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.539632 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.539681 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.540426 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"ab3cd416f082ee58c9a3add55369408d322f82e82f9edf3db6a8eb0e02bae282"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.540497 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://ab3cd416f082ee58c9a3add55369408d322f82e82f9edf3db6a8eb0e02bae282" gracePeriod=600 Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.968456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t87rp" event={"ID":"28d7935c-8eb6-49cc-96fb-882015615f8d","Type":"ContainerStarted","Data":"8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460"} Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.976579 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zqrvn" event={"ID":"5f94b2c8-efc6-47d6-8a99-90caef2e6606","Type":"ContainerDied","Data":"417e65e34f25d260fd7b322f38602827ff8437fe04897f7126e9af04e94e6eed"} Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.976619 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zqrvn" Mar 10 15:53:18 crc kubenswrapper[4795]: I0310 15:53:18.976643 4795 scope.go:117] "RemoveContainer" containerID="f931964dc542807a61d88da93527d9f4fc103528c9a32a0bd2e5d9a0c9d3edbd" Mar 10 15:53:19 crc kubenswrapper[4795]: I0310 15:53:19.004161 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="ab3cd416f082ee58c9a3add55369408d322f82e82f9edf3db6a8eb0e02bae282" exitCode=0 Mar 10 15:53:19 crc kubenswrapper[4795]: I0310 15:53:19.004218 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"ab3cd416f082ee58c9a3add55369408d322f82e82f9edf3db6a8eb0e02bae282"} Mar 10 15:53:19 crc kubenswrapper[4795]: I0310 15:53:19.004276 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"} Mar 10 15:53:19 crc kubenswrapper[4795]: I0310 15:53:19.018401 4795 scope.go:117] "RemoveContainer" containerID="5cf4d0571fa989d6717e5db2a569d55b920405472e3f47e1c4cbe453e4467a78" Mar 10 15:53:19 crc kubenswrapper[4795]: I0310 15:53:19.053495 4795 scope.go:117] "RemoveContainer" containerID="5b6d1f81b821cb50281d1f3a8b71ae636c0d7922a7d9ffad6f842cc4265dd4c8" Mar 10 15:53:19 crc kubenswrapper[4795]: I0310 15:53:19.066270 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zqrvn"] Mar 10 15:53:19 crc kubenswrapper[4795]: I0310 15:53:19.091262 4795 scope.go:117] "RemoveContainer" containerID="ca42bb155d8cc8bfd24e92b841eccfe89c012c928b78bfa6179541a424c6f917" Mar 10 15:53:19 crc kubenswrapper[4795]: I0310 15:53:19.095722 4795 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/community-operators-zqrvn"] Mar 10 15:53:19 crc kubenswrapper[4795]: E0310 15:53:19.194388 4795 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f94b2c8_efc6_47d6_8a99_90caef2e6606.slice/crio-417e65e34f25d260fd7b322f38602827ff8437fe04897f7126e9af04e94e6eed\": RecentStats: unable to find data in memory cache]" Mar 10 15:53:19 crc kubenswrapper[4795]: I0310 15:53:19.490017 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" path="/var/lib/kubelet/pods/5f94b2c8-efc6-47d6-8a99-90caef2e6606/volumes" Mar 10 15:53:22 crc kubenswrapper[4795]: I0310 15:53:22.035999 4795 generic.go:334] "Generic (PLEG): container finished" podID="28d7935c-8eb6-49cc-96fb-882015615f8d" containerID="8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460" exitCode=0 Mar 10 15:53:22 crc kubenswrapper[4795]: I0310 15:53:22.036101 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t87rp" event={"ID":"28d7935c-8eb6-49cc-96fb-882015615f8d","Type":"ContainerDied","Data":"8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460"} Mar 10 15:53:23 crc kubenswrapper[4795]: I0310 15:53:23.048184 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t87rp" event={"ID":"28d7935c-8eb6-49cc-96fb-882015615f8d","Type":"ContainerStarted","Data":"499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d"} Mar 10 15:53:23 crc kubenswrapper[4795]: I0310 15:53:23.066174 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t87rp" podStartSLOduration=2.563194921 podStartE2EDuration="7.066152671s" podCreationTimestamp="2026-03-10 15:53:16 +0000 UTC" firstStartedPulling="2026-03-10 15:53:17.951027096 +0000 UTC 
m=+2831.116768014" lastFinishedPulling="2026-03-10 15:53:22.453984866 +0000 UTC m=+2835.619725764" observedRunningTime="2026-03-10 15:53:23.065510162 +0000 UTC m=+2836.231251060" watchObservedRunningTime="2026-03-10 15:53:23.066152671 +0000 UTC m=+2836.231893569" Mar 10 15:53:26 crc kubenswrapper[4795]: I0310 15:53:26.449018 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:26 crc kubenswrapper[4795]: I0310 15:53:26.450383 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:26 crc kubenswrapper[4795]: I0310 15:53:26.497209 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:27 crc kubenswrapper[4795]: I0310 15:53:27.129638 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:29 crc kubenswrapper[4795]: I0310 15:53:29.518801 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t87rp"] Mar 10 15:53:30 crc kubenswrapper[4795]: I0310 15:53:30.103900 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t87rp" podUID="28d7935c-8eb6-49cc-96fb-882015615f8d" containerName="registry-server" containerID="cri-o://499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d" gracePeriod=2 Mar 10 15:53:30 crc kubenswrapper[4795]: I0310 15:53:30.664138 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:30 crc kubenswrapper[4795]: I0310 15:53:30.742649 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcmpv\" (UniqueName: \"kubernetes.io/projected/28d7935c-8eb6-49cc-96fb-882015615f8d-kube-api-access-zcmpv\") pod \"28d7935c-8eb6-49cc-96fb-882015615f8d\" (UID: \"28d7935c-8eb6-49cc-96fb-882015615f8d\") " Mar 10 15:53:30 crc kubenswrapper[4795]: I0310 15:53:30.742776 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-utilities\") pod \"28d7935c-8eb6-49cc-96fb-882015615f8d\" (UID: \"28d7935c-8eb6-49cc-96fb-882015615f8d\") " Mar 10 15:53:30 crc kubenswrapper[4795]: I0310 15:53:30.742842 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-catalog-content\") pod \"28d7935c-8eb6-49cc-96fb-882015615f8d\" (UID: \"28d7935c-8eb6-49cc-96fb-882015615f8d\") " Mar 10 15:53:30 crc kubenswrapper[4795]: I0310 15:53:30.743877 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-utilities" (OuterVolumeSpecName: "utilities") pod "28d7935c-8eb6-49cc-96fb-882015615f8d" (UID: "28d7935c-8eb6-49cc-96fb-882015615f8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:30 crc kubenswrapper[4795]: I0310 15:53:30.753240 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28d7935c-8eb6-49cc-96fb-882015615f8d-kube-api-access-zcmpv" (OuterVolumeSpecName: "kube-api-access-zcmpv") pod "28d7935c-8eb6-49cc-96fb-882015615f8d" (UID: "28d7935c-8eb6-49cc-96fb-882015615f8d"). InnerVolumeSpecName "kube-api-access-zcmpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 15:53:30 crc kubenswrapper[4795]: I0310 15:53:30.804919 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28d7935c-8eb6-49cc-96fb-882015615f8d" (UID: "28d7935c-8eb6-49cc-96fb-882015615f8d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 15:53:30 crc kubenswrapper[4795]: I0310 15:53:30.844730 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcmpv\" (UniqueName: \"kubernetes.io/projected/28d7935c-8eb6-49cc-96fb-882015615f8d-kube-api-access-zcmpv\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:30 crc kubenswrapper[4795]: I0310 15:53:30.844779 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:30 crc kubenswrapper[4795]: I0310 15:53:30.844794 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28d7935c-8eb6-49cc-96fb-882015615f8d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.133412 4795 generic.go:334] "Generic (PLEG): container finished" podID="28d7935c-8eb6-49cc-96fb-882015615f8d" containerID="499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d" exitCode=0 Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.133481 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t87rp" Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.133487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t87rp" event={"ID":"28d7935c-8eb6-49cc-96fb-882015615f8d","Type":"ContainerDied","Data":"499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d"} Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.133591 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t87rp" event={"ID":"28d7935c-8eb6-49cc-96fb-882015615f8d","Type":"ContainerDied","Data":"4bcb1b00268847546a50ca98c5ca6e25fc255d96b65a7a66d8b24e38f33faa34"} Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.133612 4795 scope.go:117] "RemoveContainer" containerID="499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d" Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.183031 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t87rp"] Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.189658 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t87rp"] Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.195560 4795 scope.go:117] "RemoveContainer" containerID="8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460" Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.227614 4795 scope.go:117] "RemoveContainer" containerID="2e0c4be65baf6d66e527aad667c95a872bf22071132527496dca92c2603c5948" Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.297362 4795 scope.go:117] "RemoveContainer" containerID="499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d" Mar 10 15:53:31 crc kubenswrapper[4795]: E0310 15:53:31.298802 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d\": container with ID starting with 499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d not found: ID does not exist" containerID="499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d" Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.298845 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d"} err="failed to get container status \"499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d\": rpc error: code = NotFound desc = could not find container \"499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d\": container with ID starting with 499f6ea183e72d39c197b10f8e8ef9d13d603bf471e6851927b75a875f373d3d not found: ID does not exist" Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.298873 4795 scope.go:117] "RemoveContainer" containerID="8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460" Mar 10 15:53:31 crc kubenswrapper[4795]: E0310 15:53:31.299421 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460\": container with ID starting with 8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460 not found: ID does not exist" containerID="8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460" Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.299461 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460"} err="failed to get container status \"8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460\": rpc error: code = NotFound desc = could not find container \"8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460\": container with ID 
starting with 8bfac1e47176bb370a536faac4a87de33784ae4d10e5896288e2ceb30730d460 not found: ID does not exist" Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.299481 4795 scope.go:117] "RemoveContainer" containerID="2e0c4be65baf6d66e527aad667c95a872bf22071132527496dca92c2603c5948" Mar 10 15:53:31 crc kubenswrapper[4795]: E0310 15:53:31.300190 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e0c4be65baf6d66e527aad667c95a872bf22071132527496dca92c2603c5948\": container with ID starting with 2e0c4be65baf6d66e527aad667c95a872bf22071132527496dca92c2603c5948 not found: ID does not exist" containerID="2e0c4be65baf6d66e527aad667c95a872bf22071132527496dca92c2603c5948" Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.300238 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e0c4be65baf6d66e527aad667c95a872bf22071132527496dca92c2603c5948"} err="failed to get container status \"2e0c4be65baf6d66e527aad667c95a872bf22071132527496dca92c2603c5948\": rpc error: code = NotFound desc = could not find container \"2e0c4be65baf6d66e527aad667c95a872bf22071132527496dca92c2603c5948\": container with ID starting with 2e0c4be65baf6d66e527aad667c95a872bf22071132527496dca92c2603c5948 not found: ID does not exist" Mar 10 15:53:31 crc kubenswrapper[4795]: I0310 15:53:31.487741 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28d7935c-8eb6-49cc-96fb-882015615f8d" path="/var/lib/kubelet/pods/28d7935c-8eb6-49cc-96fb-882015615f8d/volumes" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.148920 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552634-jjk7g"] Mar 10 15:54:00 crc kubenswrapper[4795]: E0310 15:54:00.149979 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d7935c-8eb6-49cc-96fb-882015615f8d" containerName="extract-utilities" Mar 10 15:54:00 crc 
kubenswrapper[4795]: I0310 15:54:00.149997 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d7935c-8eb6-49cc-96fb-882015615f8d" containerName="extract-utilities" Mar 10 15:54:00 crc kubenswrapper[4795]: E0310 15:54:00.150029 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d7935c-8eb6-49cc-96fb-882015615f8d" containerName="extract-content" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.150038 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d7935c-8eb6-49cc-96fb-882015615f8d" containerName="extract-content" Mar 10 15:54:00 crc kubenswrapper[4795]: E0310 15:54:00.150059 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" containerName="extract-content" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.150177 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" containerName="extract-content" Mar 10 15:54:00 crc kubenswrapper[4795]: E0310 15:54:00.150219 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" containerName="registry-server" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.150228 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" containerName="registry-server" Mar 10 15:54:00 crc kubenswrapper[4795]: E0310 15:54:00.150240 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" containerName="extract-utilities" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.150248 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" containerName="extract-utilities" Mar 10 15:54:00 crc kubenswrapper[4795]: E0310 15:54:00.150258 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28d7935c-8eb6-49cc-96fb-882015615f8d" containerName="registry-server" Mar 10 15:54:00 crc 
kubenswrapper[4795]: I0310 15:54:00.150266 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="28d7935c-8eb6-49cc-96fb-882015615f8d" containerName="registry-server" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.150495 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="28d7935c-8eb6-49cc-96fb-882015615f8d" containerName="registry-server" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.150508 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f94b2c8-efc6-47d6-8a99-90caef2e6606" containerName="registry-server" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.151344 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-jjk7g" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.153141 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.153450 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.153916 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.169653 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-jjk7g"] Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.308158 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f42c\" (UniqueName: \"kubernetes.io/projected/2ca06d3f-8fdf-43bd-85d8-6111fa7d2600-kube-api-access-5f42c\") pod \"auto-csr-approver-29552634-jjk7g\" (UID: \"2ca06d3f-8fdf-43bd-85d8-6111fa7d2600\") " pod="openshift-infra/auto-csr-approver-29552634-jjk7g" Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.411141 4795 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f42c\" (UniqueName: \"kubernetes.io/projected/2ca06d3f-8fdf-43bd-85d8-6111fa7d2600-kube-api-access-5f42c\") pod \"auto-csr-approver-29552634-jjk7g\" (UID: \"2ca06d3f-8fdf-43bd-85d8-6111fa7d2600\") " pod="openshift-infra/auto-csr-approver-29552634-jjk7g"
Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.431619 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f42c\" (UniqueName: \"kubernetes.io/projected/2ca06d3f-8fdf-43bd-85d8-6111fa7d2600-kube-api-access-5f42c\") pod \"auto-csr-approver-29552634-jjk7g\" (UID: \"2ca06d3f-8fdf-43bd-85d8-6111fa7d2600\") " pod="openshift-infra/auto-csr-approver-29552634-jjk7g"
Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.473145 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-jjk7g"
Mar 10 15:54:00 crc kubenswrapper[4795]: I0310 15:54:00.931509 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-jjk7g"]
Mar 10 15:54:01 crc kubenswrapper[4795]: I0310 15:54:01.397902 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552634-jjk7g" event={"ID":"2ca06d3f-8fdf-43bd-85d8-6111fa7d2600","Type":"ContainerStarted","Data":"0f9f0a2d8bd096089c491b9dbf03d3c53d3348ff417462110657937e614ce5b0"}
Mar 10 15:54:02 crc kubenswrapper[4795]: I0310 15:54:02.410677 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552634-jjk7g" event={"ID":"2ca06d3f-8fdf-43bd-85d8-6111fa7d2600","Type":"ContainerStarted","Data":"509e248740ede16847697140e704145b4054aadc9eb1f6e94a0f67f1f0f208b9"}
Mar 10 15:54:02 crc kubenswrapper[4795]: I0310 15:54:02.429110 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552634-jjk7g" podStartSLOduration=1.45502526 podStartE2EDuration="2.429042087s" podCreationTimestamp="2026-03-10 15:54:00 +0000 UTC" firstStartedPulling="2026-03-10 15:54:00.937038643 +0000 UTC m=+2874.102779551" lastFinishedPulling="2026-03-10 15:54:01.91105548 +0000 UTC m=+2875.076796378" observedRunningTime="2026-03-10 15:54:02.425359802 +0000 UTC m=+2875.591100720" watchObservedRunningTime="2026-03-10 15:54:02.429042087 +0000 UTC m=+2875.594782985"
Mar 10 15:54:03 crc kubenswrapper[4795]: I0310 15:54:03.418187 4795 generic.go:334] "Generic (PLEG): container finished" podID="2ca06d3f-8fdf-43bd-85d8-6111fa7d2600" containerID="509e248740ede16847697140e704145b4054aadc9eb1f6e94a0f67f1f0f208b9" exitCode=0
Mar 10 15:54:03 crc kubenswrapper[4795]: I0310 15:54:03.418326 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552634-jjk7g" event={"ID":"2ca06d3f-8fdf-43bd-85d8-6111fa7d2600","Type":"ContainerDied","Data":"509e248740ede16847697140e704145b4054aadc9eb1f6e94a0f67f1f0f208b9"}
Mar 10 15:54:04 crc kubenswrapper[4795]: I0310 15:54:04.812603 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-jjk7g"
Mar 10 15:54:04 crc kubenswrapper[4795]: I0310 15:54:04.893376 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f42c\" (UniqueName: \"kubernetes.io/projected/2ca06d3f-8fdf-43bd-85d8-6111fa7d2600-kube-api-access-5f42c\") pod \"2ca06d3f-8fdf-43bd-85d8-6111fa7d2600\" (UID: \"2ca06d3f-8fdf-43bd-85d8-6111fa7d2600\") "
Mar 10 15:54:04 crc kubenswrapper[4795]: I0310 15:54:04.900301 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ca06d3f-8fdf-43bd-85d8-6111fa7d2600-kube-api-access-5f42c" (OuterVolumeSpecName: "kube-api-access-5f42c") pod "2ca06d3f-8fdf-43bd-85d8-6111fa7d2600" (UID: "2ca06d3f-8fdf-43bd-85d8-6111fa7d2600"). InnerVolumeSpecName "kube-api-access-5f42c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:54:04 crc kubenswrapper[4795]: I0310 15:54:04.995894 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f42c\" (UniqueName: \"kubernetes.io/projected/2ca06d3f-8fdf-43bd-85d8-6111fa7d2600-kube-api-access-5f42c\") on node \"crc\" DevicePath \"\""
Mar 10 15:54:05 crc kubenswrapper[4795]: I0310 15:54:05.439645 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552634-jjk7g" event={"ID":"2ca06d3f-8fdf-43bd-85d8-6111fa7d2600","Type":"ContainerDied","Data":"0f9f0a2d8bd096089c491b9dbf03d3c53d3348ff417462110657937e614ce5b0"}
Mar 10 15:54:05 crc kubenswrapper[4795]: I0310 15:54:05.440238 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f9f0a2d8bd096089c491b9dbf03d3c53d3348ff417462110657937e614ce5b0"
Mar 10 15:54:05 crc kubenswrapper[4795]: I0310 15:54:05.439758 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552634-jjk7g"
Mar 10 15:54:05 crc kubenswrapper[4795]: I0310 15:54:05.510277 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552628-z9q5g"]
Mar 10 15:54:05 crc kubenswrapper[4795]: I0310 15:54:05.519596 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552628-z9q5g"]
Mar 10 15:54:07 crc kubenswrapper[4795]: I0310 15:54:07.489601 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="622392be-1c48-4efe-922c-d816093d9798" path="/var/lib/kubelet/pods/622392be-1c48-4efe-922c-d816093d9798/volumes"
Mar 10 15:54:46 crc kubenswrapper[4795]: I0310 15:54:46.334166 4795 scope.go:117] "RemoveContainer" containerID="0945381c8190a3b6b80b5a52f8c0770a68bbf2b83241137520aacb4f333832da"
Mar 10 15:55:18 crc kubenswrapper[4795]: I0310 15:55:18.539874 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:55:18 crc kubenswrapper[4795]: I0310 15:55:18.540472 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:55:48 crc kubenswrapper[4795]: I0310 15:55:48.539032 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:55:48 crc kubenswrapper[4795]: I0310 15:55:48.539665 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.148349 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552636-55sts"]
Mar 10 15:56:00 crc kubenswrapper[4795]: E0310 15:56:00.149230 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ca06d3f-8fdf-43bd-85d8-6111fa7d2600" containerName="oc"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.149244 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ca06d3f-8fdf-43bd-85d8-6111fa7d2600" containerName="oc"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.149446 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ca06d3f-8fdf-43bd-85d8-6111fa7d2600" containerName="oc"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.150172 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-55sts"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.152540 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.152791 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.154029 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.172388 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-55sts"]
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.251781 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2v29\" (UniqueName: \"kubernetes.io/projected/04e15cfe-d3c4-485e-b008-44eb9e9cc7fe-kube-api-access-q2v29\") pod \"auto-csr-approver-29552636-55sts\" (UID: \"04e15cfe-d3c4-485e-b008-44eb9e9cc7fe\") " pod="openshift-infra/auto-csr-approver-29552636-55sts"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.353888 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2v29\" (UniqueName: \"kubernetes.io/projected/04e15cfe-d3c4-485e-b008-44eb9e9cc7fe-kube-api-access-q2v29\") pod \"auto-csr-approver-29552636-55sts\" (UID: \"04e15cfe-d3c4-485e-b008-44eb9e9cc7fe\") " pod="openshift-infra/auto-csr-approver-29552636-55sts"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.373180 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2v29\" (UniqueName: \"kubernetes.io/projected/04e15cfe-d3c4-485e-b008-44eb9e9cc7fe-kube-api-access-q2v29\") pod \"auto-csr-approver-29552636-55sts\" (UID: \"04e15cfe-d3c4-485e-b008-44eb9e9cc7fe\") " pod="openshift-infra/auto-csr-approver-29552636-55sts"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.472324 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-55sts"
Mar 10 15:56:00 crc kubenswrapper[4795]: I0310 15:56:00.918785 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-55sts"]
Mar 10 15:56:01 crc kubenswrapper[4795]: I0310 15:56:01.443846 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552636-55sts" event={"ID":"04e15cfe-d3c4-485e-b008-44eb9e9cc7fe","Type":"ContainerStarted","Data":"031b4dcf8d704e4ce00d8680176f3566b0d00534562a86fc1478af92ab379bc9"}
Mar 10 15:56:02 crc kubenswrapper[4795]: I0310 15:56:02.454078 4795 generic.go:334] "Generic (PLEG): container finished" podID="04e15cfe-d3c4-485e-b008-44eb9e9cc7fe" containerID="c9a4ba929ba6aa77692c4d3a05831fe27246dbd44a2e99395b3ff653a79ba001" exitCode=0
Mar 10 15:56:02 crc kubenswrapper[4795]: I0310 15:56:02.454209 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552636-55sts" event={"ID":"04e15cfe-d3c4-485e-b008-44eb9e9cc7fe","Type":"ContainerDied","Data":"c9a4ba929ba6aa77692c4d3a05831fe27246dbd44a2e99395b3ff653a79ba001"}
Mar 10 15:56:03 crc kubenswrapper[4795]: I0310 15:56:03.824389 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-55sts"
Mar 10 15:56:03 crc kubenswrapper[4795]: I0310 15:56:03.925209 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2v29\" (UniqueName: \"kubernetes.io/projected/04e15cfe-d3c4-485e-b008-44eb9e9cc7fe-kube-api-access-q2v29\") pod \"04e15cfe-d3c4-485e-b008-44eb9e9cc7fe\" (UID: \"04e15cfe-d3c4-485e-b008-44eb9e9cc7fe\") "
Mar 10 15:56:03 crc kubenswrapper[4795]: I0310 15:56:03.930778 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e15cfe-d3c4-485e-b008-44eb9e9cc7fe-kube-api-access-q2v29" (OuterVolumeSpecName: "kube-api-access-q2v29") pod "04e15cfe-d3c4-485e-b008-44eb9e9cc7fe" (UID: "04e15cfe-d3c4-485e-b008-44eb9e9cc7fe"). InnerVolumeSpecName "kube-api-access-q2v29". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:56:04 crc kubenswrapper[4795]: I0310 15:56:04.027546 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2v29\" (UniqueName: \"kubernetes.io/projected/04e15cfe-d3c4-485e-b008-44eb9e9cc7fe-kube-api-access-q2v29\") on node \"crc\" DevicePath \"\""
Mar 10 15:56:04 crc kubenswrapper[4795]: I0310 15:56:04.476963 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552636-55sts" event={"ID":"04e15cfe-d3c4-485e-b008-44eb9e9cc7fe","Type":"ContainerDied","Data":"031b4dcf8d704e4ce00d8680176f3566b0d00534562a86fc1478af92ab379bc9"}
Mar 10 15:56:04 crc kubenswrapper[4795]: I0310 15:56:04.477337 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="031b4dcf8d704e4ce00d8680176f3566b0d00534562a86fc1478af92ab379bc9"
Mar 10 15:56:04 crc kubenswrapper[4795]: I0310 15:56:04.477045 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552636-55sts"
Mar 10 15:56:04 crc kubenswrapper[4795]: I0310 15:56:04.896945 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-zxksk"]
Mar 10 15:56:04 crc kubenswrapper[4795]: I0310 15:56:04.905326 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552630-zxksk"]
Mar 10 15:56:05 crc kubenswrapper[4795]: I0310 15:56:05.493574 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51b4df05-e41d-4693-bb6c-eee112f59a9c" path="/var/lib/kubelet/pods/51b4df05-e41d-4693-bb6c-eee112f59a9c/volumes"
Mar 10 15:56:18 crc kubenswrapper[4795]: I0310 15:56:18.539473 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 10 15:56:18 crc kubenswrapper[4795]: I0310 15:56:18.540119 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 10 15:56:18 crc kubenswrapper[4795]: I0310 15:56:18.540170 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh"
Mar 10 15:56:18 crc kubenswrapper[4795]: I0310 15:56:18.540938 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 10 15:56:18 crc kubenswrapper[4795]: I0310 15:56:18.541013 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471" gracePeriod=600
Mar 10 15:56:18 crc kubenswrapper[4795]: E0310 15:56:18.695818 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:56:19 crc kubenswrapper[4795]: I0310 15:56:19.624330 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471" exitCode=0
Mar 10 15:56:19 crc kubenswrapper[4795]: I0310 15:56:19.624404 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"}
Mar 10 15:56:19 crc kubenswrapper[4795]: I0310 15:56:19.624446 4795 scope.go:117] "RemoveContainer" containerID="ab3cd416f082ee58c9a3add55369408d322f82e82f9edf3db6a8eb0e02bae282"
Mar 10 15:56:19 crc kubenswrapper[4795]: I0310 15:56:19.625254 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:56:19 crc kubenswrapper[4795]: E0310 15:56:19.625568 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:56:34 crc kubenswrapper[4795]: I0310 15:56:34.477720 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:56:34 crc kubenswrapper[4795]: E0310 15:56:34.478866 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:56:46 crc kubenswrapper[4795]: I0310 15:56:46.439256 4795 scope.go:117] "RemoveContainer" containerID="3372817b4fff5806bd02648d21494a95d71bc765e20d1a9227f23e7084b8b696"
Mar 10 15:56:49 crc kubenswrapper[4795]: I0310 15:56:49.477156 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:56:49 crc kubenswrapper[4795]: E0310 15:56:49.478034 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:57:02 crc kubenswrapper[4795]: I0310 15:57:02.477560 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:57:02 crc kubenswrapper[4795]: E0310 15:57:02.478468 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:57:14 crc kubenswrapper[4795]: I0310 15:57:14.477519 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:57:14 crc kubenswrapper[4795]: E0310 15:57:14.478723 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:57:29 crc kubenswrapper[4795]: I0310 15:57:29.476591 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:57:29 crc kubenswrapper[4795]: E0310 15:57:29.477316 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:57:44 crc kubenswrapper[4795]: I0310 15:57:44.476836 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:57:44 crc kubenswrapper[4795]: E0310 15:57:44.477707 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:57:59 crc kubenswrapper[4795]: I0310 15:57:59.476987 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:57:59 crc kubenswrapper[4795]: E0310 15:57:59.477741 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.147320 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552638-dwl48"]
Mar 10 15:58:00 crc kubenswrapper[4795]: E0310 15:58:00.148199 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e15cfe-d3c4-485e-b008-44eb9e9cc7fe" containerName="oc"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.148813 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e15cfe-d3c4-485e-b008-44eb9e9cc7fe" containerName="oc"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.149235 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e15cfe-d3c4-485e-b008-44eb9e9cc7fe" containerName="oc"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.150121 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-dwl48"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.153491 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.153529 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.153705 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.154475 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-dwl48"]
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.251896 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgn4x\" (UniqueName: \"kubernetes.io/projected/31a5da95-c50d-4e22-9629-f744cfcfa649-kube-api-access-tgn4x\") pod \"auto-csr-approver-29552638-dwl48\" (UID: \"31a5da95-c50d-4e22-9629-f744cfcfa649\") " pod="openshift-infra/auto-csr-approver-29552638-dwl48"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.353983 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgn4x\" (UniqueName: \"kubernetes.io/projected/31a5da95-c50d-4e22-9629-f744cfcfa649-kube-api-access-tgn4x\") pod \"auto-csr-approver-29552638-dwl48\" (UID: \"31a5da95-c50d-4e22-9629-f744cfcfa649\") " pod="openshift-infra/auto-csr-approver-29552638-dwl48"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.376365 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgn4x\" (UniqueName: \"kubernetes.io/projected/31a5da95-c50d-4e22-9629-f744cfcfa649-kube-api-access-tgn4x\") pod \"auto-csr-approver-29552638-dwl48\" (UID: \"31a5da95-c50d-4e22-9629-f744cfcfa649\") " pod="openshift-infra/auto-csr-approver-29552638-dwl48"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.471103 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-dwl48"
Mar 10 15:58:00 crc kubenswrapper[4795]: I0310 15:58:00.945504 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-dwl48"]
Mar 10 15:58:01 crc kubenswrapper[4795]: I0310 15:58:01.553556 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552638-dwl48" event={"ID":"31a5da95-c50d-4e22-9629-f744cfcfa649","Type":"ContainerStarted","Data":"ae21984e68c768c7d028c1304f388ce67493679563feda69ab832755def98ed9"}
Mar 10 15:58:02 crc kubenswrapper[4795]: I0310 15:58:02.564150 4795 generic.go:334] "Generic (PLEG): container finished" podID="31a5da95-c50d-4e22-9629-f744cfcfa649" containerID="ef92c24147795359f51faf18bc3dd9841f47cea48899249a5faba8d4a66279f2" exitCode=0
Mar 10 15:58:02 crc kubenswrapper[4795]: I0310 15:58:02.564467 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552638-dwl48" event={"ID":"31a5da95-c50d-4e22-9629-f744cfcfa649","Type":"ContainerDied","Data":"ef92c24147795359f51faf18bc3dd9841f47cea48899249a5faba8d4a66279f2"}
Mar 10 15:58:03 crc kubenswrapper[4795]: I0310 15:58:03.948018 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-dwl48"
Mar 10 15:58:04 crc kubenswrapper[4795]: I0310 15:58:04.030769 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgn4x\" (UniqueName: \"kubernetes.io/projected/31a5da95-c50d-4e22-9629-f744cfcfa649-kube-api-access-tgn4x\") pod \"31a5da95-c50d-4e22-9629-f744cfcfa649\" (UID: \"31a5da95-c50d-4e22-9629-f744cfcfa649\") "
Mar 10 15:58:04 crc kubenswrapper[4795]: I0310 15:58:04.039938 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a5da95-c50d-4e22-9629-f744cfcfa649-kube-api-access-tgn4x" (OuterVolumeSpecName: "kube-api-access-tgn4x") pod "31a5da95-c50d-4e22-9629-f744cfcfa649" (UID: "31a5da95-c50d-4e22-9629-f744cfcfa649"). InnerVolumeSpecName "kube-api-access-tgn4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 15:58:04 crc kubenswrapper[4795]: I0310 15:58:04.133951 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgn4x\" (UniqueName: \"kubernetes.io/projected/31a5da95-c50d-4e22-9629-f744cfcfa649-kube-api-access-tgn4x\") on node \"crc\" DevicePath \"\""
Mar 10 15:58:04 crc kubenswrapper[4795]: I0310 15:58:04.585824 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552638-dwl48" event={"ID":"31a5da95-c50d-4e22-9629-f744cfcfa649","Type":"ContainerDied","Data":"ae21984e68c768c7d028c1304f388ce67493679563feda69ab832755def98ed9"}
Mar 10 15:58:04 crc kubenswrapper[4795]: I0310 15:58:04.586188 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae21984e68c768c7d028c1304f388ce67493679563feda69ab832755def98ed9"
Mar 10 15:58:04 crc kubenswrapper[4795]: I0310 15:58:04.585867 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552638-dwl48"
Mar 10 15:58:05 crc kubenswrapper[4795]: I0310 15:58:05.033102 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-xl7lb"]
Mar 10 15:58:05 crc kubenswrapper[4795]: I0310 15:58:05.046833 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552632-xl7lb"]
Mar 10 15:58:05 crc kubenswrapper[4795]: I0310 15:58:05.491252 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39dce396-78f7-40b5-b4db-abe0bbc524c7" path="/var/lib/kubelet/pods/39dce396-78f7-40b5-b4db-abe0bbc524c7/volumes"
Mar 10 15:58:12 crc kubenswrapper[4795]: I0310 15:58:12.476395 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:58:12 crc kubenswrapper[4795]: E0310 15:58:12.477183 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:58:25 crc kubenswrapper[4795]: I0310 15:58:25.477266 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:58:25 crc kubenswrapper[4795]: E0310 15:58:25.478128 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:58:36 crc kubenswrapper[4795]: I0310 15:58:36.475961 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:58:36 crc kubenswrapper[4795]: E0310 15:58:36.476808 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:58:46 crc kubenswrapper[4795]: I0310 15:58:46.529619 4795 scope.go:117] "RemoveContainer" containerID="0fac467e277e0fe3b940d02fa772a8c4630a4d90d114cf740f7372ab4914b0ca"
Mar 10 15:58:50 crc kubenswrapper[4795]: I0310 15:58:50.476684 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:58:50 crc kubenswrapper[4795]: E0310 15:58:50.479677 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:59:03 crc kubenswrapper[4795]: I0310 15:59:03.476583 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:59:03 crc kubenswrapper[4795]: E0310 15:59:03.477281 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:59:16 crc kubenswrapper[4795]: I0310 15:59:16.476829 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:59:16 crc kubenswrapper[4795]: E0310 15:59:16.477755 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:59:30 crc kubenswrapper[4795]: I0310 15:59:30.476855 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:59:30 crc kubenswrapper[4795]: E0310 15:59:30.477719 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:59:45 crc kubenswrapper[4795]: I0310 15:59:45.477355 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:59:45 crc kubenswrapper[4795]: E0310 15:59:45.478024 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 15:59:58 crc kubenswrapper[4795]: I0310 15:59:58.479024 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471"
Mar 10 15:59:58 crc kubenswrapper[4795]: E0310 15:59:58.480672 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.163957 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552640-bgmfb"]
Mar 10 16:00:00 crc kubenswrapper[4795]: E0310 16:00:00.164627 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a5da95-c50d-4e22-9629-f744cfcfa649" containerName="oc"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.164641 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a5da95-c50d-4e22-9629-f744cfcfa649" containerName="oc"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.164904 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a5da95-c50d-4e22-9629-f744cfcfa649" containerName="oc"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.165523 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-bgmfb"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.168488 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.168541 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.176232 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-bgmfb"]
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.183364 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.248112 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj"]
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.249701 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.251912 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.252020 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.257954 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj"]
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.354262 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw42s\" (UniqueName: \"kubernetes.io/projected/909e2760-8a48-4c97-8a05-caa0e928ee7c-kube-api-access-pw42s\") pod \"auto-csr-approver-29552640-bgmfb\" (UID: \"909e2760-8a48-4c97-8a05-caa0e928ee7c\") " pod="openshift-infra/auto-csr-approver-29552640-bgmfb"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.354583 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c6090cf-962e-4fbc-95fd-5e829a4fc209-config-volume\") pod \"collect-profiles-29552640-8jtlj\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj"
Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.354621 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c6090cf-962e-4fbc-95fd-5e829a4fc209-secret-volume\") pod \"collect-profiles-29552640-8jtlj\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") "
pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.354683 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr6vt\" (UniqueName: \"kubernetes.io/projected/8c6090cf-962e-4fbc-95fd-5e829a4fc209-kube-api-access-hr6vt\") pod \"collect-profiles-29552640-8jtlj\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.455812 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw42s\" (UniqueName: \"kubernetes.io/projected/909e2760-8a48-4c97-8a05-caa0e928ee7c-kube-api-access-pw42s\") pod \"auto-csr-approver-29552640-bgmfb\" (UID: \"909e2760-8a48-4c97-8a05-caa0e928ee7c\") " pod="openshift-infra/auto-csr-approver-29552640-bgmfb" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.455857 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c6090cf-962e-4fbc-95fd-5e829a4fc209-config-volume\") pod \"collect-profiles-29552640-8jtlj\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.455887 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c6090cf-962e-4fbc-95fd-5e829a4fc209-secret-volume\") pod \"collect-profiles-29552640-8jtlj\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.455962 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr6vt\" (UniqueName: 
\"kubernetes.io/projected/8c6090cf-962e-4fbc-95fd-5e829a4fc209-kube-api-access-hr6vt\") pod \"collect-profiles-29552640-8jtlj\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.457037 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c6090cf-962e-4fbc-95fd-5e829a4fc209-config-volume\") pod \"collect-profiles-29552640-8jtlj\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.463530 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c6090cf-962e-4fbc-95fd-5e829a4fc209-secret-volume\") pod \"collect-profiles-29552640-8jtlj\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.473265 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr6vt\" (UniqueName: \"kubernetes.io/projected/8c6090cf-962e-4fbc-95fd-5e829a4fc209-kube-api-access-hr6vt\") pod \"collect-profiles-29552640-8jtlj\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.484313 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw42s\" (UniqueName: \"kubernetes.io/projected/909e2760-8a48-4c97-8a05-caa0e928ee7c-kube-api-access-pw42s\") pod \"auto-csr-approver-29552640-bgmfb\" (UID: \"909e2760-8a48-4c97-8a05-caa0e928ee7c\") " pod="openshift-infra/auto-csr-approver-29552640-bgmfb" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.486936 4795 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-bgmfb" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.578507 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.976990 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-bgmfb"] Mar 10 16:00:00 crc kubenswrapper[4795]: I0310 16:00:00.979284 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:00:01 crc kubenswrapper[4795]: W0310 16:00:01.126874 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c6090cf_962e_4fbc_95fd_5e829a4fc209.slice/crio-f81f3e9bc9656c2c6ed5c17e902262f709ea9d52809ba3e7082e3fc89b49f860 WatchSource:0}: Error finding container f81f3e9bc9656c2c6ed5c17e902262f709ea9d52809ba3e7082e3fc89b49f860: Status 404 returned error can't find the container with id f81f3e9bc9656c2c6ed5c17e902262f709ea9d52809ba3e7082e3fc89b49f860 Mar 10 16:00:01 crc kubenswrapper[4795]: I0310 16:00:01.127564 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj"] Mar 10 16:00:01 crc kubenswrapper[4795]: I0310 16:00:01.796091 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552640-bgmfb" event={"ID":"909e2760-8a48-4c97-8a05-caa0e928ee7c","Type":"ContainerStarted","Data":"1952091e6f8cc46bb4d557b7066ab6553a6fd4aef0cfbc1750fe8d20826048de"} Mar 10 16:00:01 crc kubenswrapper[4795]: I0310 16:00:01.797782 4795 generic.go:334] "Generic (PLEG): container finished" podID="8c6090cf-962e-4fbc-95fd-5e829a4fc209" containerID="f97929f8cb56054b29b7082adc429467e27f8b27a941837509afc697f918c6fb" exitCode=0 Mar 10 16:00:01 crc 
kubenswrapper[4795]: I0310 16:00:01.797836 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" event={"ID":"8c6090cf-962e-4fbc-95fd-5e829a4fc209","Type":"ContainerDied","Data":"f97929f8cb56054b29b7082adc429467e27f8b27a941837509afc697f918c6fb"} Mar 10 16:00:01 crc kubenswrapper[4795]: I0310 16:00:01.797860 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" event={"ID":"8c6090cf-962e-4fbc-95fd-5e829a4fc209","Type":"ContainerStarted","Data":"f81f3e9bc9656c2c6ed5c17e902262f709ea9d52809ba3e7082e3fc89b49f860"} Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.181669 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.218719 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c6090cf-962e-4fbc-95fd-5e829a4fc209-secret-volume\") pod \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") " Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.218987 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr6vt\" (UniqueName: \"kubernetes.io/projected/8c6090cf-962e-4fbc-95fd-5e829a4fc209-kube-api-access-hr6vt\") pod \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") " Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.219348 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c6090cf-962e-4fbc-95fd-5e829a4fc209-config-volume\") pod \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\" (UID: \"8c6090cf-962e-4fbc-95fd-5e829a4fc209\") " Mar 10 16:00:03 crc kubenswrapper[4795]: 
I0310 16:00:03.219770 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6090cf-962e-4fbc-95fd-5e829a4fc209-config-volume" (OuterVolumeSpecName: "config-volume") pod "8c6090cf-962e-4fbc-95fd-5e829a4fc209" (UID: "8c6090cf-962e-4fbc-95fd-5e829a4fc209"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.219965 4795 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8c6090cf-962e-4fbc-95fd-5e829a4fc209-config-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.227131 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6090cf-962e-4fbc-95fd-5e829a4fc209-kube-api-access-hr6vt" (OuterVolumeSpecName: "kube-api-access-hr6vt") pod "8c6090cf-962e-4fbc-95fd-5e829a4fc209" (UID: "8c6090cf-962e-4fbc-95fd-5e829a4fc209"). InnerVolumeSpecName "kube-api-access-hr6vt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.227332 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c6090cf-962e-4fbc-95fd-5e829a4fc209-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "8c6090cf-962e-4fbc-95fd-5e829a4fc209" (UID: "8c6090cf-962e-4fbc-95fd-5e829a4fc209"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.321380 4795 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/8c6090cf-962e-4fbc-95fd-5e829a4fc209-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.321407 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr6vt\" (UniqueName: \"kubernetes.io/projected/8c6090cf-962e-4fbc-95fd-5e829a4fc209-kube-api-access-hr6vt\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.814611 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" event={"ID":"8c6090cf-962e-4fbc-95fd-5e829a4fc209","Type":"ContainerDied","Data":"f81f3e9bc9656c2c6ed5c17e902262f709ea9d52809ba3e7082e3fc89b49f860"} Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.814664 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f81f3e9bc9656c2c6ed5c17e902262f709ea9d52809ba3e7082e3fc89b49f860" Mar 10 16:00:03 crc kubenswrapper[4795]: I0310 16:00:03.814667 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29552640-8jtlj" Mar 10 16:00:04 crc kubenswrapper[4795]: I0310 16:00:04.253431 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"] Mar 10 16:00:04 crc kubenswrapper[4795]: I0310 16:00:04.261353 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29552595-lbf4t"] Mar 10 16:00:05 crc kubenswrapper[4795]: I0310 16:00:05.491093 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34cd1574-f40a-4b09-b79a-8bd20a4d9698" path="/var/lib/kubelet/pods/34cd1574-f40a-4b09-b79a-8bd20a4d9698/volumes" Mar 10 16:00:09 crc kubenswrapper[4795]: I0310 16:00:09.476826 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471" Mar 10 16:00:09 crc kubenswrapper[4795]: E0310 16:00:09.477537 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:00:10 crc kubenswrapper[4795]: I0310 16:00:10.879457 4795 generic.go:334] "Generic (PLEG): container finished" podID="909e2760-8a48-4c97-8a05-caa0e928ee7c" containerID="b669ebe71e808658b91eb86b7a11019b79c48f37fcab8d4a209a74b50e68f8dd" exitCode=0 Mar 10 16:00:10 crc kubenswrapper[4795]: I0310 16:00:10.879541 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552640-bgmfb" event={"ID":"909e2760-8a48-4c97-8a05-caa0e928ee7c","Type":"ContainerDied","Data":"b669ebe71e808658b91eb86b7a11019b79c48f37fcab8d4a209a74b50e68f8dd"} Mar 10 
16:00:12 crc kubenswrapper[4795]: I0310 16:00:12.256841 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-bgmfb" Mar 10 16:00:12 crc kubenswrapper[4795]: I0310 16:00:12.406621 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw42s\" (UniqueName: \"kubernetes.io/projected/909e2760-8a48-4c97-8a05-caa0e928ee7c-kube-api-access-pw42s\") pod \"909e2760-8a48-4c97-8a05-caa0e928ee7c\" (UID: \"909e2760-8a48-4c97-8a05-caa0e928ee7c\") " Mar 10 16:00:12 crc kubenswrapper[4795]: I0310 16:00:12.418275 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/909e2760-8a48-4c97-8a05-caa0e928ee7c-kube-api-access-pw42s" (OuterVolumeSpecName: "kube-api-access-pw42s") pod "909e2760-8a48-4c97-8a05-caa0e928ee7c" (UID: "909e2760-8a48-4c97-8a05-caa0e928ee7c"). InnerVolumeSpecName "kube-api-access-pw42s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:00:12 crc kubenswrapper[4795]: I0310 16:00:12.509317 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw42s\" (UniqueName: \"kubernetes.io/projected/909e2760-8a48-4c97-8a05-caa0e928ee7c-kube-api-access-pw42s\") on node \"crc\" DevicePath \"\"" Mar 10 16:00:12 crc kubenswrapper[4795]: I0310 16:00:12.896311 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552640-bgmfb" event={"ID":"909e2760-8a48-4c97-8a05-caa0e928ee7c","Type":"ContainerDied","Data":"1952091e6f8cc46bb4d557b7066ab6553a6fd4aef0cfbc1750fe8d20826048de"} Mar 10 16:00:12 crc kubenswrapper[4795]: I0310 16:00:12.896396 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552640-bgmfb" Mar 10 16:00:12 crc kubenswrapper[4795]: I0310 16:00:12.896408 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1952091e6f8cc46bb4d557b7066ab6553a6fd4aef0cfbc1750fe8d20826048de" Mar 10 16:00:13 crc kubenswrapper[4795]: I0310 16:00:13.331373 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-jjk7g"] Mar 10 16:00:13 crc kubenswrapper[4795]: I0310 16:00:13.338825 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552634-jjk7g"] Mar 10 16:00:13 crc kubenswrapper[4795]: I0310 16:00:13.500264 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ca06d3f-8fdf-43bd-85d8-6111fa7d2600" path="/var/lib/kubelet/pods/2ca06d3f-8fdf-43bd-85d8-6111fa7d2600/volumes" Mar 10 16:00:22 crc kubenswrapper[4795]: I0310 16:00:22.476460 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471" Mar 10 16:00:22 crc kubenswrapper[4795]: E0310 16:00:22.477292 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:00:33 crc kubenswrapper[4795]: I0310 16:00:33.476560 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471" Mar 10 16:00:33 crc kubenswrapper[4795]: E0310 16:00:33.477288 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:00:45 crc kubenswrapper[4795]: I0310 16:00:45.477078 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471" Mar 10 16:00:45 crc kubenswrapper[4795]: E0310 16:00:45.478832 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:00:46 crc kubenswrapper[4795]: I0310 16:00:46.632821 4795 scope.go:117] "RemoveContainer" containerID="509e248740ede16847697140e704145b4054aadc9eb1f6e94a0f67f1f0f208b9" Mar 10 16:00:46 crc kubenswrapper[4795]: I0310 16:00:46.676344 4795 scope.go:117] "RemoveContainer" containerID="50a6ad604734c081e6c96830706a5a9f07eb3cb3311c9e768206e61cf4876ea3" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.150174 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29552641-682xt"] Mar 10 16:01:00 crc kubenswrapper[4795]: E0310 16:01:00.151196 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6090cf-962e-4fbc-95fd-5e829a4fc209" containerName="collect-profiles" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.151213 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6090cf-962e-4fbc-95fd-5e829a4fc209" containerName="collect-profiles" Mar 10 16:01:00 crc kubenswrapper[4795]: E0310 16:01:00.151248 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="909e2760-8a48-4c97-8a05-caa0e928ee7c" containerName="oc" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.151256 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="909e2760-8a48-4c97-8a05-caa0e928ee7c" containerName="oc" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.151456 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="909e2760-8a48-4c97-8a05-caa0e928ee7c" containerName="oc" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.151487 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6090cf-962e-4fbc-95fd-5e829a4fc209" containerName="collect-profiles" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.152622 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.167733 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552641-682xt"] Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.332284 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-combined-ca-bundle\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.332426 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-config-data\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.332492 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22nss\" (UniqueName: 
\"kubernetes.io/projected/31964fe2-d4bb-4fe8-8046-c266914fe1b3-kube-api-access-22nss\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.332564 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-fernet-keys\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.434263 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-config-data\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.434318 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22nss\" (UniqueName: \"kubernetes.io/projected/31964fe2-d4bb-4fe8-8046-c266914fe1b3-kube-api-access-22nss\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.434366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-fernet-keys\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.434432 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-combined-ca-bundle\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.440094 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-fernet-keys\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.440249 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-combined-ca-bundle\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.441945 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-config-data\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.458833 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22nss\" (UniqueName: \"kubernetes.io/projected/31964fe2-d4bb-4fe8-8046-c266914fe1b3-kube-api-access-22nss\") pod \"keystone-cron-29552641-682xt\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.476223 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471" Mar 10 16:01:00 crc kubenswrapper[4795]: E0310 16:01:00.476557 4795 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.488554 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:00 crc kubenswrapper[4795]: I0310 16:01:00.931533 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29552641-682xt"] Mar 10 16:01:01 crc kubenswrapper[4795]: I0310 16:01:01.308749 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552641-682xt" event={"ID":"31964fe2-d4bb-4fe8-8046-c266914fe1b3","Type":"ContainerStarted","Data":"9fa21f337d54795773e9098a8c53183208b76b26160afe91892ee0e3662d17c0"} Mar 10 16:01:01 crc kubenswrapper[4795]: I0310 16:01:01.308800 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552641-682xt" event={"ID":"31964fe2-d4bb-4fe8-8046-c266914fe1b3","Type":"ContainerStarted","Data":"2af71ede7fa7bdcaaca969cded0fd20b72d461c150f6732dff42bd685b17023d"} Mar 10 16:01:01 crc kubenswrapper[4795]: I0310 16:01:01.328280 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29552641-682xt" podStartSLOduration=1.3282635489999999 podStartE2EDuration="1.328263549s" podCreationTimestamp="2026-03-10 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-10 16:01:01.325053858 +0000 UTC m=+3294.490794766" watchObservedRunningTime="2026-03-10 16:01:01.328263549 +0000 UTC m=+3294.494004447" Mar 10 16:01:04 crc 
kubenswrapper[4795]: I0310 16:01:04.333628 4795 generic.go:334] "Generic (PLEG): container finished" podID="31964fe2-d4bb-4fe8-8046-c266914fe1b3" containerID="9fa21f337d54795773e9098a8c53183208b76b26160afe91892ee0e3662d17c0" exitCode=0 Mar 10 16:01:04 crc kubenswrapper[4795]: I0310 16:01:04.333753 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552641-682xt" event={"ID":"31964fe2-d4bb-4fe8-8046-c266914fe1b3","Type":"ContainerDied","Data":"9fa21f337d54795773e9098a8c53183208b76b26160afe91892ee0e3662d17c0"} Mar 10 16:01:05 crc kubenswrapper[4795]: I0310 16:01:05.767433 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:05 crc kubenswrapper[4795]: I0310 16:01:05.938230 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-fernet-keys\") pod \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " Mar 10 16:01:05 crc kubenswrapper[4795]: I0310 16:01:05.938452 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22nss\" (UniqueName: \"kubernetes.io/projected/31964fe2-d4bb-4fe8-8046-c266914fe1b3-kube-api-access-22nss\") pod \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " Mar 10 16:01:05 crc kubenswrapper[4795]: I0310 16:01:05.938575 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-combined-ca-bundle\") pod \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " Mar 10 16:01:05 crc kubenswrapper[4795]: I0310 16:01:05.938656 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-config-data\") pod \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\" (UID: \"31964fe2-d4bb-4fe8-8046-c266914fe1b3\") " Mar 10 16:01:05 crc kubenswrapper[4795]: I0310 16:01:05.944831 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "31964fe2-d4bb-4fe8-8046-c266914fe1b3" (UID: "31964fe2-d4bb-4fe8-8046-c266914fe1b3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:01:05 crc kubenswrapper[4795]: I0310 16:01:05.945254 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31964fe2-d4bb-4fe8-8046-c266914fe1b3-kube-api-access-22nss" (OuterVolumeSpecName: "kube-api-access-22nss") pod "31964fe2-d4bb-4fe8-8046-c266914fe1b3" (UID: "31964fe2-d4bb-4fe8-8046-c266914fe1b3"). InnerVolumeSpecName "kube-api-access-22nss". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:01:05 crc kubenswrapper[4795]: I0310 16:01:05.968806 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31964fe2-d4bb-4fe8-8046-c266914fe1b3" (UID: "31964fe2-d4bb-4fe8-8046-c266914fe1b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:01:05 crc kubenswrapper[4795]: I0310 16:01:05.991993 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-config-data" (OuterVolumeSpecName: "config-data") pod "31964fe2-d4bb-4fe8-8046-c266914fe1b3" (UID: "31964fe2-d4bb-4fe8-8046-c266914fe1b3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:01:06 crc kubenswrapper[4795]: I0310 16:01:06.040633 4795 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:06 crc kubenswrapper[4795]: I0310 16:01:06.040670 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:06 crc kubenswrapper[4795]: I0310 16:01:06.040682 4795 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/31964fe2-d4bb-4fe8-8046-c266914fe1b3-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:06 crc kubenswrapper[4795]: I0310 16:01:06.040693 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22nss\" (UniqueName: \"kubernetes.io/projected/31964fe2-d4bb-4fe8-8046-c266914fe1b3-kube-api-access-22nss\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:06 crc kubenswrapper[4795]: I0310 16:01:06.352567 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29552641-682xt" event={"ID":"31964fe2-d4bb-4fe8-8046-c266914fe1b3","Type":"ContainerDied","Data":"2af71ede7fa7bdcaaca969cded0fd20b72d461c150f6732dff42bd685b17023d"} Mar 10 16:01:06 crc kubenswrapper[4795]: I0310 16:01:06.352607 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2af71ede7fa7bdcaaca969cded0fd20b72d461c150f6732dff42bd685b17023d" Mar 10 16:01:06 crc kubenswrapper[4795]: I0310 16:01:06.352671 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29552641-682xt" Mar 10 16:01:13 crc kubenswrapper[4795]: I0310 16:01:13.476586 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471" Mar 10 16:01:13 crc kubenswrapper[4795]: E0310 16:01:13.478543 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.314484 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k68m8"] Mar 10 16:01:20 crc kubenswrapper[4795]: E0310 16:01:20.315437 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31964fe2-d4bb-4fe8-8046-c266914fe1b3" containerName="keystone-cron" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.315452 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="31964fe2-d4bb-4fe8-8046-c266914fe1b3" containerName="keystone-cron" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.315643 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="31964fe2-d4bb-4fe8-8046-c266914fe1b3" containerName="keystone-cron" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.317030 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.341496 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k68m8"] Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.408549 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-utilities\") pod \"redhat-operators-k68m8\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.408985 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d84w5\" (UniqueName: \"kubernetes.io/projected/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-kube-api-access-d84w5\") pod \"redhat-operators-k68m8\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.409211 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-catalog-content\") pod \"redhat-operators-k68m8\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.511241 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d84w5\" (UniqueName: \"kubernetes.io/projected/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-kube-api-access-d84w5\") pod \"redhat-operators-k68m8\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.511880 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-catalog-content\") pod \"redhat-operators-k68m8\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.512106 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-utilities\") pod \"redhat-operators-k68m8\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.512491 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-catalog-content\") pod \"redhat-operators-k68m8\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.512658 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-utilities\") pod \"redhat-operators-k68m8\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.532312 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d84w5\" (UniqueName: \"kubernetes.io/projected/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-kube-api-access-d84w5\") pod \"redhat-operators-k68m8\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:20 crc kubenswrapper[4795]: I0310 16:01:20.653533 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:21 crc kubenswrapper[4795]: I0310 16:01:21.136245 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k68m8"] Mar 10 16:01:21 crc kubenswrapper[4795]: I0310 16:01:21.507862 4795 generic.go:334] "Generic (PLEG): container finished" podID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerID="eb78e3ed226300b061805035b75e437d43890e0e9471e5187c5e3ccfa26806ff" exitCode=0 Mar 10 16:01:21 crc kubenswrapper[4795]: I0310 16:01:21.507942 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k68m8" event={"ID":"1cabbc2e-dd73-4bc7-a904-1c2ef2952096","Type":"ContainerDied","Data":"eb78e3ed226300b061805035b75e437d43890e0e9471e5187c5e3ccfa26806ff"} Mar 10 16:01:21 crc kubenswrapper[4795]: I0310 16:01:21.508274 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k68m8" event={"ID":"1cabbc2e-dd73-4bc7-a904-1c2ef2952096","Type":"ContainerStarted","Data":"ffc3249b24a79b1503d74da0b5ce646555c86b66bffeee2c06c907961b205a1b"} Mar 10 16:01:23 crc kubenswrapper[4795]: I0310 16:01:23.525017 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k68m8" event={"ID":"1cabbc2e-dd73-4bc7-a904-1c2ef2952096","Type":"ContainerStarted","Data":"7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f"} Mar 10 16:01:25 crc kubenswrapper[4795]: I0310 16:01:25.547036 4795 generic.go:334] "Generic (PLEG): container finished" podID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerID="7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f" exitCode=0 Mar 10 16:01:25 crc kubenswrapper[4795]: I0310 16:01:25.547139 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k68m8" 
event={"ID":"1cabbc2e-dd73-4bc7-a904-1c2ef2952096","Type":"ContainerDied","Data":"7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f"} Mar 10 16:01:26 crc kubenswrapper[4795]: I0310 16:01:26.564247 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k68m8" event={"ID":"1cabbc2e-dd73-4bc7-a904-1c2ef2952096","Type":"ContainerStarted","Data":"8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628"} Mar 10 16:01:26 crc kubenswrapper[4795]: I0310 16:01:26.588656 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k68m8" podStartSLOduration=1.843297341 podStartE2EDuration="6.588637364s" podCreationTimestamp="2026-03-10 16:01:20 +0000 UTC" firstStartedPulling="2026-03-10 16:01:21.510188993 +0000 UTC m=+3314.675929891" lastFinishedPulling="2026-03-10 16:01:26.255528996 +0000 UTC m=+3319.421269914" observedRunningTime="2026-03-10 16:01:26.582812178 +0000 UTC m=+3319.748553086" watchObservedRunningTime="2026-03-10 16:01:26.588637364 +0000 UTC m=+3319.754378262" Mar 10 16:01:27 crc kubenswrapper[4795]: I0310 16:01:27.489498 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471" Mar 10 16:01:28 crc kubenswrapper[4795]: I0310 16:01:28.586525 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"f902561e08cc6ec569cf1763fcb9ca628dff6f707fa7c825cd56f8d025611eac"} Mar 10 16:01:30 crc kubenswrapper[4795]: I0310 16:01:30.653671 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:30 crc kubenswrapper[4795]: I0310 16:01:30.654287 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 
16:01:31 crc kubenswrapper[4795]: I0310 16:01:31.714205 4795 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k68m8" podUID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerName="registry-server" probeResult="failure" output=< Mar 10 16:01:31 crc kubenswrapper[4795]: timeout: failed to connect service ":50051" within 1s Mar 10 16:01:31 crc kubenswrapper[4795]: > Mar 10 16:01:40 crc kubenswrapper[4795]: I0310 16:01:40.719039 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:40 crc kubenswrapper[4795]: I0310 16:01:40.787722 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:40 crc kubenswrapper[4795]: I0310 16:01:40.960605 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k68m8"] Mar 10 16:01:42 crc kubenswrapper[4795]: I0310 16:01:42.699347 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k68m8" podUID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerName="registry-server" containerID="cri-o://8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628" gracePeriod=2 Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.183147 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.315814 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-utilities\") pod \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.315889 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d84w5\" (UniqueName: \"kubernetes.io/projected/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-kube-api-access-d84w5\") pod \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.316013 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-catalog-content\") pod \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\" (UID: \"1cabbc2e-dd73-4bc7-a904-1c2ef2952096\") " Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.318400 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-utilities" (OuterVolumeSpecName: "utilities") pod "1cabbc2e-dd73-4bc7-a904-1c2ef2952096" (UID: "1cabbc2e-dd73-4bc7-a904-1c2ef2952096"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.323338 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-kube-api-access-d84w5" (OuterVolumeSpecName: "kube-api-access-d84w5") pod "1cabbc2e-dd73-4bc7-a904-1c2ef2952096" (UID: "1cabbc2e-dd73-4bc7-a904-1c2ef2952096"). InnerVolumeSpecName "kube-api-access-d84w5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.418445 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.418492 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d84w5\" (UniqueName: \"kubernetes.io/projected/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-kube-api-access-d84w5\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.489961 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cabbc2e-dd73-4bc7-a904-1c2ef2952096" (UID: "1cabbc2e-dd73-4bc7-a904-1c2ef2952096"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.520167 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cabbc2e-dd73-4bc7-a904-1c2ef2952096-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.710386 4795 generic.go:334] "Generic (PLEG): container finished" podID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerID="8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628" exitCode=0 Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.710425 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k68m8" event={"ID":"1cabbc2e-dd73-4bc7-a904-1c2ef2952096","Type":"ContainerDied","Data":"8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628"} Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.710433 4795 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k68m8" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.710453 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k68m8" event={"ID":"1cabbc2e-dd73-4bc7-a904-1c2ef2952096","Type":"ContainerDied","Data":"ffc3249b24a79b1503d74da0b5ce646555c86b66bffeee2c06c907961b205a1b"} Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.710473 4795 scope.go:117] "RemoveContainer" containerID="8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.747211 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k68m8"] Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.759238 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k68m8"] Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.760891 4795 scope.go:117] "RemoveContainer" containerID="7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.779674 4795 scope.go:117] "RemoveContainer" containerID="eb78e3ed226300b061805035b75e437d43890e0e9471e5187c5e3ccfa26806ff" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.822335 4795 scope.go:117] "RemoveContainer" containerID="8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628" Mar 10 16:01:43 crc kubenswrapper[4795]: E0310 16:01:43.822812 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628\": container with ID starting with 8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628 not found: ID does not exist" containerID="8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.822853 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628"} err="failed to get container status \"8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628\": rpc error: code = NotFound desc = could not find container \"8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628\": container with ID starting with 8f95f092fa391952cbe55d09dc9c72ba5b8d730e2237002618f943a41199f628 not found: ID does not exist" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.822879 4795 scope.go:117] "RemoveContainer" containerID="7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f" Mar 10 16:01:43 crc kubenswrapper[4795]: E0310 16:01:43.823260 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f\": container with ID starting with 7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f not found: ID does not exist" containerID="7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.823287 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f"} err="failed to get container status \"7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f\": rpc error: code = NotFound desc = could not find container \"7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f\": container with ID starting with 7468dbd1eaa5cc76f491305001f0f29af9e27d7f9c33e053ac585ac881f4084f not found: ID does not exist" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.823303 4795 scope.go:117] "RemoveContainer" containerID="eb78e3ed226300b061805035b75e437d43890e0e9471e5187c5e3ccfa26806ff" Mar 10 16:01:43 crc kubenswrapper[4795]: E0310 
16:01:43.823609 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb78e3ed226300b061805035b75e437d43890e0e9471e5187c5e3ccfa26806ff\": container with ID starting with eb78e3ed226300b061805035b75e437d43890e0e9471e5187c5e3ccfa26806ff not found: ID does not exist" containerID="eb78e3ed226300b061805035b75e437d43890e0e9471e5187c5e3ccfa26806ff" Mar 10 16:01:43 crc kubenswrapper[4795]: I0310 16:01:43.823636 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb78e3ed226300b061805035b75e437d43890e0e9471e5187c5e3ccfa26806ff"} err="failed to get container status \"eb78e3ed226300b061805035b75e437d43890e0e9471e5187c5e3ccfa26806ff\": rpc error: code = NotFound desc = could not find container \"eb78e3ed226300b061805035b75e437d43890e0e9471e5187c5e3ccfa26806ff\": container with ID starting with eb78e3ed226300b061805035b75e437d43890e0e9471e5187c5e3ccfa26806ff not found: ID does not exist" Mar 10 16:01:45 crc kubenswrapper[4795]: I0310 16:01:45.488258 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" path="/var/lib/kubelet/pods/1cabbc2e-dd73-4bc7-a904-1c2ef2952096/volumes" Mar 10 16:01:55 crc kubenswrapper[4795]: I0310 16:01:55.821588 4795 generic.go:334] "Generic (PLEG): container finished" podID="865ec795-fb93-4628-bdf5-5451ffbf2c0c" containerID="a8ffb6484fa57ec3651bec64ae7bbcbd3811d920b206e8f56690a656761f884a" exitCode=0 Mar 10 16:01:55 crc kubenswrapper[4795]: I0310 16:01:55.821666 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"865ec795-fb93-4628-bdf5-5451ffbf2c0c","Type":"ContainerDied","Data":"a8ffb6484fa57ec3651bec64ae7bbcbd3811d920b206e8f56690a656761f884a"} Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.242700 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.397217 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ca-certs\") pod \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.397275 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-temporary\") pod \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.397327 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ssh-key\") pod \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.397409 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config\") pod \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.397512 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.397574 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-config-data\") pod \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.397641 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config-secret\") pod \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.397722 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-workdir\") pod \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.397741 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpl64\" (UniqueName: \"kubernetes.io/projected/865ec795-fb93-4628-bdf5-5451ffbf2c0c-kube-api-access-jpl64\") pod \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\" (UID: \"865ec795-fb93-4628-bdf5-5451ffbf2c0c\") " Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.398535 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-config-data" (OuterVolumeSpecName: "config-data") pod "865ec795-fb93-4628-bdf5-5451ffbf2c0c" (UID: "865ec795-fb93-4628-bdf5-5451ffbf2c0c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.399683 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "865ec795-fb93-4628-bdf5-5451ffbf2c0c" (UID: "865ec795-fb93-4628-bdf5-5451ffbf2c0c"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.402972 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865ec795-fb93-4628-bdf5-5451ffbf2c0c-kube-api-access-jpl64" (OuterVolumeSpecName: "kube-api-access-jpl64") pod "865ec795-fb93-4628-bdf5-5451ffbf2c0c" (UID: "865ec795-fb93-4628-bdf5-5451ffbf2c0c"). InnerVolumeSpecName "kube-api-access-jpl64". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.403356 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "865ec795-fb93-4628-bdf5-5451ffbf2c0c" (UID: "865ec795-fb93-4628-bdf5-5451ffbf2c0c"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.404573 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "test-operator-logs") pod "865ec795-fb93-4628-bdf5-5451ffbf2c0c" (UID: "865ec795-fb93-4628-bdf5-5451ffbf2c0c"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.424746 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "865ec795-fb93-4628-bdf5-5451ffbf2c0c" (UID: "865ec795-fb93-4628-bdf5-5451ffbf2c0c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.426668 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "865ec795-fb93-4628-bdf5-5451ffbf2c0c" (UID: "865ec795-fb93-4628-bdf5-5451ffbf2c0c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.428794 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "865ec795-fb93-4628-bdf5-5451ffbf2c0c" (UID: "865ec795-fb93-4628-bdf5-5451ffbf2c0c"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.448370 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "865ec795-fb93-4628-bdf5-5451ffbf2c0c" (UID: "865ec795-fb93-4628-bdf5-5451ffbf2c0c"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.500586 4795 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.500625 4795 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-config-data\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.500639 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.500655 4795 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.500667 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpl64\" (UniqueName: \"kubernetes.io/projected/865ec795-fb93-4628-bdf5-5451ffbf2c0c-kube-api-access-jpl64\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.500679 4795 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.500691 4795 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/865ec795-fb93-4628-bdf5-5451ffbf2c0c-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" 
Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.500706 4795 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/865ec795-fb93-4628-bdf5-5451ffbf2c0c-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.500717 4795 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/865ec795-fb93-4628-bdf5-5451ffbf2c0c-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.519728 4795 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.602128 4795 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.846804 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"865ec795-fb93-4628-bdf5-5451ffbf2c0c","Type":"ContainerDied","Data":"8f81bc7e2b4ea553e06db5d4ce8851bc84e974f7851711438ff9818c0b6c0dfe"} Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.846840 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f81bc7e2b4ea553e06db5d4ce8851bc84e974f7851711438ff9818c0b6c0dfe" Mar 10 16:01:57 crc kubenswrapper[4795]: I0310 16:01:57.846877 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.164883 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552642-l6bbt"] Mar 10 16:02:00 crc kubenswrapper[4795]: E0310 16:02:00.165577 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerName="extract-content" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.165589 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerName="extract-content" Mar 10 16:02:00 crc kubenswrapper[4795]: E0310 16:02:00.165602 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerName="extract-utilities" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.165608 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerName="extract-utilities" Mar 10 16:02:00 crc kubenswrapper[4795]: E0310 16:02:00.165639 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="865ec795-fb93-4628-bdf5-5451ffbf2c0c" containerName="tempest-tests-tempest-tests-runner" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.165648 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="865ec795-fb93-4628-bdf5-5451ffbf2c0c" containerName="tempest-tests-tempest-tests-runner" Mar 10 16:02:00 crc kubenswrapper[4795]: E0310 16:02:00.165663 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerName="registry-server" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.165670 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerName="registry-server" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.165857 4795 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="865ec795-fb93-4628-bdf5-5451ffbf2c0c" containerName="tempest-tests-tempest-tests-runner" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.165876 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cabbc2e-dd73-4bc7-a904-1c2ef2952096" containerName="registry-server" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.166459 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-l6bbt" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.169277 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.169341 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.169674 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.174089 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-l6bbt"] Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.250637 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zx6n\" (UniqueName: \"kubernetes.io/projected/4780d693-bde7-430b-b1ed-4ff5c27f3b0a-kube-api-access-9zx6n\") pod \"auto-csr-approver-29552642-l6bbt\" (UID: \"4780d693-bde7-430b-b1ed-4ff5c27f3b0a\") " pod="openshift-infra/auto-csr-approver-29552642-l6bbt" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.352303 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zx6n\" (UniqueName: \"kubernetes.io/projected/4780d693-bde7-430b-b1ed-4ff5c27f3b0a-kube-api-access-9zx6n\") pod \"auto-csr-approver-29552642-l6bbt\" (UID: \"4780d693-bde7-430b-b1ed-4ff5c27f3b0a\") " 
pod="openshift-infra/auto-csr-approver-29552642-l6bbt" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.376335 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zx6n\" (UniqueName: \"kubernetes.io/projected/4780d693-bde7-430b-b1ed-4ff5c27f3b0a-kube-api-access-9zx6n\") pod \"auto-csr-approver-29552642-l6bbt\" (UID: \"4780d693-bde7-430b-b1ed-4ff5c27f3b0a\") " pod="openshift-infra/auto-csr-approver-29552642-l6bbt" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.486130 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-l6bbt" Mar 10 16:02:00 crc kubenswrapper[4795]: I0310 16:02:00.926961 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-l6bbt"] Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.302300 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.303627 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.309343 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-rtrl5" Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.312503 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.371182 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpzqq\" (UniqueName: \"kubernetes.io/projected/718d7d95-6ec8-4bb6-9014-f1a7bf22cdca-kube-api-access-vpzqq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"718d7d95-6ec8-4bb6-9014-f1a7bf22cdca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.371246 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"718d7d95-6ec8-4bb6-9014-f1a7bf22cdca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.473373 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpzqq\" (UniqueName: \"kubernetes.io/projected/718d7d95-6ec8-4bb6-9014-f1a7bf22cdca-kube-api-access-vpzqq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"718d7d95-6ec8-4bb6-9014-f1a7bf22cdca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.473712 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"718d7d95-6ec8-4bb6-9014-f1a7bf22cdca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.474207 4795 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"718d7d95-6ec8-4bb6-9014-f1a7bf22cdca\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.495163 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpzqq\" (UniqueName: \"kubernetes.io/projected/718d7d95-6ec8-4bb6-9014-f1a7bf22cdca-kube-api-access-vpzqq\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"718d7d95-6ec8-4bb6-9014-f1a7bf22cdca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.499664 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"718d7d95-6ec8-4bb6-9014-f1a7bf22cdca\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.667417 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 10 16:02:01 crc kubenswrapper[4795]: I0310 16:02:01.897918 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552642-l6bbt" event={"ID":"4780d693-bde7-430b-b1ed-4ff5c27f3b0a","Type":"ContainerStarted","Data":"afec4ea76d09008d856d8fc651a13ac44230bc39be784be6510779432f0e7589"} Mar 10 16:02:02 crc kubenswrapper[4795]: I0310 16:02:02.104906 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 10 16:02:02 crc kubenswrapper[4795]: I0310 16:02:02.909348 4795 generic.go:334] "Generic (PLEG): container finished" podID="4780d693-bde7-430b-b1ed-4ff5c27f3b0a" containerID="ec16fb270fe3a7b7e0b1d3efac6801254a3ab63a09b53ce2ef79eac76979f684" exitCode=0 Mar 10 16:02:02 crc kubenswrapper[4795]: I0310 16:02:02.909407 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552642-l6bbt" event={"ID":"4780d693-bde7-430b-b1ed-4ff5c27f3b0a","Type":"ContainerDied","Data":"ec16fb270fe3a7b7e0b1d3efac6801254a3ab63a09b53ce2ef79eac76979f684"} Mar 10 16:02:02 crc kubenswrapper[4795]: I0310 16:02:02.913456 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"718d7d95-6ec8-4bb6-9014-f1a7bf22cdca","Type":"ContainerStarted","Data":"cf659270ae7817a56a1c2eed9fd54309da21337f121255c2ea2e2613e9030788"} Mar 10 16:02:03 crc kubenswrapper[4795]: I0310 16:02:03.932144 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"718d7d95-6ec8-4bb6-9014-f1a7bf22cdca","Type":"ContainerStarted","Data":"1574a1b6c0d3b1308f9a37748d65ecea049ba64aceb5f1137ef179baeed242ce"} Mar 10 16:02:03 crc kubenswrapper[4795]: I0310 16:02:03.950773 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.866318069 podStartE2EDuration="2.950750033s" podCreationTimestamp="2026-03-10 16:02:01 +0000 UTC" firstStartedPulling="2026-03-10 16:02:02.109976251 +0000 UTC m=+3355.275717159" lastFinishedPulling="2026-03-10 16:02:03.194408235 +0000 UTC m=+3356.360149123" observedRunningTime="2026-03-10 16:02:03.944182466 +0000 UTC m=+3357.109923374" watchObservedRunningTime="2026-03-10 16:02:03.950750033 +0000 UTC m=+3357.116490951" Mar 10 16:02:04 crc kubenswrapper[4795]: I0310 16:02:04.280019 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-l6bbt" Mar 10 16:02:04 crc kubenswrapper[4795]: I0310 16:02:04.325502 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zx6n\" (UniqueName: \"kubernetes.io/projected/4780d693-bde7-430b-b1ed-4ff5c27f3b0a-kube-api-access-9zx6n\") pod \"4780d693-bde7-430b-b1ed-4ff5c27f3b0a\" (UID: \"4780d693-bde7-430b-b1ed-4ff5c27f3b0a\") " Mar 10 16:02:04 crc kubenswrapper[4795]: I0310 16:02:04.333414 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4780d693-bde7-430b-b1ed-4ff5c27f3b0a-kube-api-access-9zx6n" (OuterVolumeSpecName: "kube-api-access-9zx6n") pod "4780d693-bde7-430b-b1ed-4ff5c27f3b0a" (UID: "4780d693-bde7-430b-b1ed-4ff5c27f3b0a"). InnerVolumeSpecName "kube-api-access-9zx6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:02:04 crc kubenswrapper[4795]: I0310 16:02:04.427699 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zx6n\" (UniqueName: \"kubernetes.io/projected/4780d693-bde7-430b-b1ed-4ff5c27f3b0a-kube-api-access-9zx6n\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:04 crc kubenswrapper[4795]: I0310 16:02:04.947806 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552642-l6bbt" Mar 10 16:02:04 crc kubenswrapper[4795]: I0310 16:02:04.948273 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552642-l6bbt" event={"ID":"4780d693-bde7-430b-b1ed-4ff5c27f3b0a","Type":"ContainerDied","Data":"afec4ea76d09008d856d8fc651a13ac44230bc39be784be6510779432f0e7589"} Mar 10 16:02:04 crc kubenswrapper[4795]: I0310 16:02:04.948330 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afec4ea76d09008d856d8fc651a13ac44230bc39be784be6510779432f0e7589" Mar 10 16:02:05 crc kubenswrapper[4795]: I0310 16:02:05.345407 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-55sts"] Mar 10 16:02:05 crc kubenswrapper[4795]: I0310 16:02:05.354529 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552636-55sts"] Mar 10 16:02:05 crc kubenswrapper[4795]: I0310 16:02:05.490513 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e15cfe-d3c4-485e-b008-44eb9e9cc7fe" path="/var/lib/kubelet/pods/04e15cfe-d3c4-485e-b008-44eb9e9cc7fe/volumes" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.648576 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fkzlx"] Mar 10 16:02:30 crc kubenswrapper[4795]: E0310 16:02:30.649702 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4780d693-bde7-430b-b1ed-4ff5c27f3b0a" containerName="oc" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.649723 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4780d693-bde7-430b-b1ed-4ff5c27f3b0a" containerName="oc" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.650025 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4780d693-bde7-430b-b1ed-4ff5c27f3b0a" containerName="oc" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 
16:02:30.651971 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.661000 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fkzlx"] Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.734228 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-catalog-content\") pod \"redhat-marketplace-fkzlx\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.734312 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4tn9\" (UniqueName: \"kubernetes.io/projected/d9358f85-b7bd-4aed-873a-12ed437891d4-kube-api-access-k4tn9\") pod \"redhat-marketplace-fkzlx\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.734389 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-utilities\") pod \"redhat-marketplace-fkzlx\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.836114 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-utilities\") pod \"redhat-marketplace-fkzlx\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 
16:02:30.836513 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-catalog-content\") pod \"redhat-marketplace-fkzlx\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.836626 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4tn9\" (UniqueName: \"kubernetes.io/projected/d9358f85-b7bd-4aed-873a-12ed437891d4-kube-api-access-k4tn9\") pod \"redhat-marketplace-fkzlx\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.836673 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-utilities\") pod \"redhat-marketplace-fkzlx\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.836914 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-catalog-content\") pod \"redhat-marketplace-fkzlx\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.861034 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4tn9\" (UniqueName: \"kubernetes.io/projected/d9358f85-b7bd-4aed-873a-12ed437891d4-kube-api-access-k4tn9\") pod \"redhat-marketplace-fkzlx\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:30 crc kubenswrapper[4795]: I0310 16:02:30.972721 4795 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:31 crc kubenswrapper[4795]: I0310 16:02:31.436561 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fkzlx"] Mar 10 16:02:32 crc kubenswrapper[4795]: I0310 16:02:32.203226 4795 generic.go:334] "Generic (PLEG): container finished" podID="d9358f85-b7bd-4aed-873a-12ed437891d4" containerID="d822354e015a3c64dc0a21ae9d7abab0418d42b4a44af9a176b4a8361efb68c6" exitCode=0 Mar 10 16:02:32 crc kubenswrapper[4795]: I0310 16:02:32.203305 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fkzlx" event={"ID":"d9358f85-b7bd-4aed-873a-12ed437891d4","Type":"ContainerDied","Data":"d822354e015a3c64dc0a21ae9d7abab0418d42b4a44af9a176b4a8361efb68c6"} Mar 10 16:02:32 crc kubenswrapper[4795]: I0310 16:02:32.203569 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fkzlx" event={"ID":"d9358f85-b7bd-4aed-873a-12ed437891d4","Type":"ContainerStarted","Data":"9ae7a93d74628ffad506e5d2cc5f3883a1a94347f6ba9f5e841b01229f9ecfe7"} Mar 10 16:02:34 crc kubenswrapper[4795]: I0310 16:02:34.219835 4795 generic.go:334] "Generic (PLEG): container finished" podID="d9358f85-b7bd-4aed-873a-12ed437891d4" containerID="f13c2cb875a25016d3162d70d6f9c9b5529b1eb75eee457a43ae2e8cca9de6e2" exitCode=0 Mar 10 16:02:34 crc kubenswrapper[4795]: I0310 16:02:34.220012 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fkzlx" event={"ID":"d9358f85-b7bd-4aed-873a-12ed437891d4","Type":"ContainerDied","Data":"f13c2cb875a25016d3162d70d6f9c9b5529b1eb75eee457a43ae2e8cca9de6e2"} Mar 10 16:02:35 crc kubenswrapper[4795]: I0310 16:02:35.231977 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fkzlx" 
event={"ID":"d9358f85-b7bd-4aed-873a-12ed437891d4","Type":"ContainerStarted","Data":"dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48"} Mar 10 16:02:35 crc kubenswrapper[4795]: I0310 16:02:35.257475 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fkzlx" podStartSLOduration=2.773241211 podStartE2EDuration="5.257447412s" podCreationTimestamp="2026-03-10 16:02:30 +0000 UTC" firstStartedPulling="2026-03-10 16:02:32.205275634 +0000 UTC m=+3385.371016532" lastFinishedPulling="2026-03-10 16:02:34.689481835 +0000 UTC m=+3387.855222733" observedRunningTime="2026-03-10 16:02:35.253404447 +0000 UTC m=+3388.419145335" watchObservedRunningTime="2026-03-10 16:02:35.257447412 +0000 UTC m=+3388.423188310" Mar 10 16:02:40 crc kubenswrapper[4795]: I0310 16:02:40.973743 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:40 crc kubenswrapper[4795]: I0310 16:02:40.974389 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.046421 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.324534 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.369266 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fkzlx"] Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.755880 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7sdzb/must-gather-9ghfg"] Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.757369 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7sdzb/must-gather-9ghfg" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.759742 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7sdzb"/"openshift-service-ca.crt" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.759837 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-7sdzb"/"kube-root-ca.crt" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.769059 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-7sdzb"/"default-dockercfg-fqnrj" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.776847 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7sdzb/must-gather-9ghfg"] Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.855785 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5v8p\" (UniqueName: \"kubernetes.io/projected/7a1c9fc1-975f-44c7-95b4-a731abb4a742-kube-api-access-g5v8p\") pod \"must-gather-9ghfg\" (UID: \"7a1c9fc1-975f-44c7-95b4-a731abb4a742\") " pod="openshift-must-gather-7sdzb/must-gather-9ghfg" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.855936 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a1c9fc1-975f-44c7-95b4-a731abb4a742-must-gather-output\") pod \"must-gather-9ghfg\" (UID: \"7a1c9fc1-975f-44c7-95b4-a731abb4a742\") " pod="openshift-must-gather-7sdzb/must-gather-9ghfg" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.957464 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5v8p\" (UniqueName: \"kubernetes.io/projected/7a1c9fc1-975f-44c7-95b4-a731abb4a742-kube-api-access-g5v8p\") pod \"must-gather-9ghfg\" (UID: \"7a1c9fc1-975f-44c7-95b4-a731abb4a742\") " 
pod="openshift-must-gather-7sdzb/must-gather-9ghfg" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.957628 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a1c9fc1-975f-44c7-95b4-a731abb4a742-must-gather-output\") pod \"must-gather-9ghfg\" (UID: \"7a1c9fc1-975f-44c7-95b4-a731abb4a742\") " pod="openshift-must-gather-7sdzb/must-gather-9ghfg" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.958129 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a1c9fc1-975f-44c7-95b4-a731abb4a742-must-gather-output\") pod \"must-gather-9ghfg\" (UID: \"7a1c9fc1-975f-44c7-95b4-a731abb4a742\") " pod="openshift-must-gather-7sdzb/must-gather-9ghfg" Mar 10 16:02:41 crc kubenswrapper[4795]: I0310 16:02:41.979848 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5v8p\" (UniqueName: \"kubernetes.io/projected/7a1c9fc1-975f-44c7-95b4-a731abb4a742-kube-api-access-g5v8p\") pod \"must-gather-9ghfg\" (UID: \"7a1c9fc1-975f-44c7-95b4-a731abb4a742\") " pod="openshift-must-gather-7sdzb/must-gather-9ghfg" Mar 10 16:02:42 crc kubenswrapper[4795]: I0310 16:02:42.077609 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7sdzb/must-gather-9ghfg" Mar 10 16:02:42 crc kubenswrapper[4795]: I0310 16:02:42.620346 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7sdzb/must-gather-9ghfg"] Mar 10 16:02:43 crc kubenswrapper[4795]: I0310 16:02:43.305553 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7sdzb/must-gather-9ghfg" event={"ID":"7a1c9fc1-975f-44c7-95b4-a731abb4a742","Type":"ContainerStarted","Data":"a0877653abe097a1f38d76725c61ac473efa36a5c909dfd5a3adb433ac4d77e9"} Mar 10 16:02:43 crc kubenswrapper[4795]: I0310 16:02:43.305742 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fkzlx" podUID="d9358f85-b7bd-4aed-873a-12ed437891d4" containerName="registry-server" containerID="cri-o://dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48" gracePeriod=2 Mar 10 16:02:43 crc kubenswrapper[4795]: I0310 16:02:43.780539 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:43 crc kubenswrapper[4795]: I0310 16:02:43.889610 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-utilities\") pod \"d9358f85-b7bd-4aed-873a-12ed437891d4\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " Mar 10 16:02:43 crc kubenswrapper[4795]: I0310 16:02:43.889749 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-catalog-content\") pod \"d9358f85-b7bd-4aed-873a-12ed437891d4\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " Mar 10 16:02:43 crc kubenswrapper[4795]: I0310 16:02:43.889837 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4tn9\" (UniqueName: \"kubernetes.io/projected/d9358f85-b7bd-4aed-873a-12ed437891d4-kube-api-access-k4tn9\") pod \"d9358f85-b7bd-4aed-873a-12ed437891d4\" (UID: \"d9358f85-b7bd-4aed-873a-12ed437891d4\") " Mar 10 16:02:43 crc kubenswrapper[4795]: I0310 16:02:43.897133 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9358f85-b7bd-4aed-873a-12ed437891d4-kube-api-access-k4tn9" (OuterVolumeSpecName: "kube-api-access-k4tn9") pod "d9358f85-b7bd-4aed-873a-12ed437891d4" (UID: "d9358f85-b7bd-4aed-873a-12ed437891d4"). InnerVolumeSpecName "kube-api-access-k4tn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:02:43 crc kubenswrapper[4795]: I0310 16:02:43.898060 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-utilities" (OuterVolumeSpecName: "utilities") pod "d9358f85-b7bd-4aed-873a-12ed437891d4" (UID: "d9358f85-b7bd-4aed-873a-12ed437891d4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:02:43 crc kubenswrapper[4795]: I0310 16:02:43.992504 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:43 crc kubenswrapper[4795]: I0310 16:02:43.992539 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4tn9\" (UniqueName: \"kubernetes.io/projected/d9358f85-b7bd-4aed-873a-12ed437891d4-kube-api-access-k4tn9\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.125970 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9358f85-b7bd-4aed-873a-12ed437891d4" (UID: "d9358f85-b7bd-4aed-873a-12ed437891d4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.195495 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9358f85-b7bd-4aed-873a-12ed437891d4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.314982 4795 generic.go:334] "Generic (PLEG): container finished" podID="d9358f85-b7bd-4aed-873a-12ed437891d4" containerID="dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48" exitCode=0 Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.315136 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fkzlx" event={"ID":"d9358f85-b7bd-4aed-873a-12ed437891d4","Type":"ContainerDied","Data":"dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48"} Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.316495 4795 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-fkzlx" event={"ID":"d9358f85-b7bd-4aed-873a-12ed437891d4","Type":"ContainerDied","Data":"9ae7a93d74628ffad506e5d2cc5f3883a1a94347f6ba9f5e841b01229f9ecfe7"} Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.316580 4795 scope.go:117] "RemoveContainer" containerID="dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.315226 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fkzlx" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.342659 4795 scope.go:117] "RemoveContainer" containerID="f13c2cb875a25016d3162d70d6f9c9b5529b1eb75eee457a43ae2e8cca9de6e2" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.351640 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fkzlx"] Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.359954 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fkzlx"] Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.373278 4795 scope.go:117] "RemoveContainer" containerID="d822354e015a3c64dc0a21ae9d7abab0418d42b4a44af9a176b4a8361efb68c6" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.409363 4795 scope.go:117] "RemoveContainer" containerID="dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48" Mar 10 16:02:44 crc kubenswrapper[4795]: E0310 16:02:44.417211 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48\": container with ID starting with dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48 not found: ID does not exist" containerID="dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.417339 4795 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48"} err="failed to get container status \"dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48\": rpc error: code = NotFound desc = could not find container \"dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48\": container with ID starting with dac5f396c7ea8c53ac3722f5e6514a84db41642b95f8fb5d4906a31b79750f48 not found: ID does not exist" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.417414 4795 scope.go:117] "RemoveContainer" containerID="f13c2cb875a25016d3162d70d6f9c9b5529b1eb75eee457a43ae2e8cca9de6e2" Mar 10 16:02:44 crc kubenswrapper[4795]: E0310 16:02:44.418619 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f13c2cb875a25016d3162d70d6f9c9b5529b1eb75eee457a43ae2e8cca9de6e2\": container with ID starting with f13c2cb875a25016d3162d70d6f9c9b5529b1eb75eee457a43ae2e8cca9de6e2 not found: ID does not exist" containerID="f13c2cb875a25016d3162d70d6f9c9b5529b1eb75eee457a43ae2e8cca9de6e2" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.418725 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f13c2cb875a25016d3162d70d6f9c9b5529b1eb75eee457a43ae2e8cca9de6e2"} err="failed to get container status \"f13c2cb875a25016d3162d70d6f9c9b5529b1eb75eee457a43ae2e8cca9de6e2\": rpc error: code = NotFound desc = could not find container \"f13c2cb875a25016d3162d70d6f9c9b5529b1eb75eee457a43ae2e8cca9de6e2\": container with ID starting with f13c2cb875a25016d3162d70d6f9c9b5529b1eb75eee457a43ae2e8cca9de6e2 not found: ID does not exist" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.418794 4795 scope.go:117] "RemoveContainer" containerID="d822354e015a3c64dc0a21ae9d7abab0418d42b4a44af9a176b4a8361efb68c6" Mar 10 16:02:44 crc kubenswrapper[4795]: E0310 
16:02:44.419055 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d822354e015a3c64dc0a21ae9d7abab0418d42b4a44af9a176b4a8361efb68c6\": container with ID starting with d822354e015a3c64dc0a21ae9d7abab0418d42b4a44af9a176b4a8361efb68c6 not found: ID does not exist" containerID="d822354e015a3c64dc0a21ae9d7abab0418d42b4a44af9a176b4a8361efb68c6" Mar 10 16:02:44 crc kubenswrapper[4795]: I0310 16:02:44.419143 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d822354e015a3c64dc0a21ae9d7abab0418d42b4a44af9a176b4a8361efb68c6"} err="failed to get container status \"d822354e015a3c64dc0a21ae9d7abab0418d42b4a44af9a176b4a8361efb68c6\": rpc error: code = NotFound desc = could not find container \"d822354e015a3c64dc0a21ae9d7abab0418d42b4a44af9a176b4a8361efb68c6\": container with ID starting with d822354e015a3c64dc0a21ae9d7abab0418d42b4a44af9a176b4a8361efb68c6 not found: ID does not exist" Mar 10 16:02:45 crc kubenswrapper[4795]: I0310 16:02:45.492809 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9358f85-b7bd-4aed-873a-12ed437891d4" path="/var/lib/kubelet/pods/d9358f85-b7bd-4aed-873a-12ed437891d4/volumes" Mar 10 16:02:46 crc kubenswrapper[4795]: I0310 16:02:46.822652 4795 scope.go:117] "RemoveContainer" containerID="c9a4ba929ba6aa77692c4d3a05831fe27246dbd44a2e99395b3ff653a79ba001" Mar 10 16:02:50 crc kubenswrapper[4795]: I0310 16:02:50.377308 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7sdzb/must-gather-9ghfg" event={"ID":"7a1c9fc1-975f-44c7-95b4-a731abb4a742","Type":"ContainerStarted","Data":"df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96"} Mar 10 16:02:50 crc kubenswrapper[4795]: I0310 16:02:50.377851 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7sdzb/must-gather-9ghfg" 
event={"ID":"7a1c9fc1-975f-44c7-95b4-a731abb4a742","Type":"ContainerStarted","Data":"9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3"} Mar 10 16:02:50 crc kubenswrapper[4795]: I0310 16:02:50.396167 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7sdzb/must-gather-9ghfg" podStartSLOduration=2.73833311 podStartE2EDuration="9.396150329s" podCreationTimestamp="2026-03-10 16:02:41 +0000 UTC" firstStartedPulling="2026-03-10 16:02:42.667835602 +0000 UTC m=+3395.833576500" lastFinishedPulling="2026-03-10 16:02:49.325652811 +0000 UTC m=+3402.491393719" observedRunningTime="2026-03-10 16:02:50.394970015 +0000 UTC m=+3403.560710913" watchObservedRunningTime="2026-03-10 16:02:50.396150329 +0000 UTC m=+3403.561891227" Mar 10 16:02:52 crc kubenswrapper[4795]: I0310 16:02:52.886502 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7sdzb/crc-debug-f8j9w"] Mar 10 16:02:52 crc kubenswrapper[4795]: E0310 16:02:52.887416 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9358f85-b7bd-4aed-873a-12ed437891d4" containerName="extract-utilities" Mar 10 16:02:52 crc kubenswrapper[4795]: I0310 16:02:52.887432 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9358f85-b7bd-4aed-873a-12ed437891d4" containerName="extract-utilities" Mar 10 16:02:52 crc kubenswrapper[4795]: E0310 16:02:52.887458 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9358f85-b7bd-4aed-873a-12ed437891d4" containerName="extract-content" Mar 10 16:02:52 crc kubenswrapper[4795]: I0310 16:02:52.887464 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9358f85-b7bd-4aed-873a-12ed437891d4" containerName="extract-content" Mar 10 16:02:52 crc kubenswrapper[4795]: E0310 16:02:52.887483 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9358f85-b7bd-4aed-873a-12ed437891d4" containerName="registry-server" Mar 10 16:02:52 crc kubenswrapper[4795]: I0310 
16:02:52.887491 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9358f85-b7bd-4aed-873a-12ed437891d4" containerName="registry-server" Mar 10 16:02:52 crc kubenswrapper[4795]: I0310 16:02:52.887730 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9358f85-b7bd-4aed-873a-12ed437891d4" containerName="registry-server" Mar 10 16:02:52 crc kubenswrapper[4795]: I0310 16:02:52.888351 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" Mar 10 16:02:52 crc kubenswrapper[4795]: I0310 16:02:52.964622 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ee9c800-27ee-456a-9138-0d42ce9a9f11-host\") pod \"crc-debug-f8j9w\" (UID: \"1ee9c800-27ee-456a-9138-0d42ce9a9f11\") " pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" Mar 10 16:02:52 crc kubenswrapper[4795]: I0310 16:02:52.964763 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq9dd\" (UniqueName: \"kubernetes.io/projected/1ee9c800-27ee-456a-9138-0d42ce9a9f11-kube-api-access-tq9dd\") pod \"crc-debug-f8j9w\" (UID: \"1ee9c800-27ee-456a-9138-0d42ce9a9f11\") " pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" Mar 10 16:02:53 crc kubenswrapper[4795]: I0310 16:02:53.067266 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ee9c800-27ee-456a-9138-0d42ce9a9f11-host\") pod \"crc-debug-f8j9w\" (UID: \"1ee9c800-27ee-456a-9138-0d42ce9a9f11\") " pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" Mar 10 16:02:53 crc kubenswrapper[4795]: I0310 16:02:53.067369 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq9dd\" (UniqueName: \"kubernetes.io/projected/1ee9c800-27ee-456a-9138-0d42ce9a9f11-kube-api-access-tq9dd\") pod \"crc-debug-f8j9w\" (UID: 
\"1ee9c800-27ee-456a-9138-0d42ce9a9f11\") " pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" Mar 10 16:02:53 crc kubenswrapper[4795]: I0310 16:02:53.067746 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ee9c800-27ee-456a-9138-0d42ce9a9f11-host\") pod \"crc-debug-f8j9w\" (UID: \"1ee9c800-27ee-456a-9138-0d42ce9a9f11\") " pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" Mar 10 16:02:53 crc kubenswrapper[4795]: I0310 16:02:53.086669 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq9dd\" (UniqueName: \"kubernetes.io/projected/1ee9c800-27ee-456a-9138-0d42ce9a9f11-kube-api-access-tq9dd\") pod \"crc-debug-f8j9w\" (UID: \"1ee9c800-27ee-456a-9138-0d42ce9a9f11\") " pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" Mar 10 16:02:53 crc kubenswrapper[4795]: I0310 16:02:53.214233 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" Mar 10 16:02:53 crc kubenswrapper[4795]: I0310 16:02:53.406965 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" event={"ID":"1ee9c800-27ee-456a-9138-0d42ce9a9f11","Type":"ContainerStarted","Data":"5ad9cf50e55211959cbcdd66daa5b22a65327815bdadde270a0e5967c39424b7"} Mar 10 16:03:04 crc kubenswrapper[4795]: I0310 16:03:04.502310 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" event={"ID":"1ee9c800-27ee-456a-9138-0d42ce9a9f11","Type":"ContainerStarted","Data":"80ccd3c6d64c89a8863960a925a914c471f2ad26a8e7240cf0f55c860b43b96c"} Mar 10 16:03:04 crc kubenswrapper[4795]: I0310 16:03:04.523363 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" podStartSLOduration=1.6602438290000001 podStartE2EDuration="12.523346891s" podCreationTimestamp="2026-03-10 16:02:52 +0000 UTC" 
firstStartedPulling="2026-03-10 16:02:53.254921681 +0000 UTC m=+3406.420662579" lastFinishedPulling="2026-03-10 16:03:04.118024743 +0000 UTC m=+3417.283765641" observedRunningTime="2026-03-10 16:03:04.513778579 +0000 UTC m=+3417.679519477" watchObservedRunningTime="2026-03-10 16:03:04.523346891 +0000 UTC m=+3417.689087789" Mar 10 16:03:43 crc kubenswrapper[4795]: I0310 16:03:43.839531 4795 generic.go:334] "Generic (PLEG): container finished" podID="1ee9c800-27ee-456a-9138-0d42ce9a9f11" containerID="80ccd3c6d64c89a8863960a925a914c471f2ad26a8e7240cf0f55c860b43b96c" exitCode=0 Mar 10 16:03:43 crc kubenswrapper[4795]: I0310 16:03:43.839601 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" event={"ID":"1ee9c800-27ee-456a-9138-0d42ce9a9f11","Type":"ContainerDied","Data":"80ccd3c6d64c89a8863960a925a914c471f2ad26a8e7240cf0f55c860b43b96c"} Mar 10 16:03:44 crc kubenswrapper[4795]: I0310 16:03:44.951381 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" Mar 10 16:03:44 crc kubenswrapper[4795]: I0310 16:03:44.993252 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7sdzb/crc-debug-f8j9w"] Mar 10 16:03:45 crc kubenswrapper[4795]: I0310 16:03:45.000186 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7sdzb/crc-debug-f8j9w"] Mar 10 16:03:45 crc kubenswrapper[4795]: I0310 16:03:45.058866 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ee9c800-27ee-456a-9138-0d42ce9a9f11-host\") pod \"1ee9c800-27ee-456a-9138-0d42ce9a9f11\" (UID: \"1ee9c800-27ee-456a-9138-0d42ce9a9f11\") " Mar 10 16:03:45 crc kubenswrapper[4795]: I0310 16:03:45.059006 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ee9c800-27ee-456a-9138-0d42ce9a9f11-host" (OuterVolumeSpecName: "host") pod "1ee9c800-27ee-456a-9138-0d42ce9a9f11" (UID: "1ee9c800-27ee-456a-9138-0d42ce9a9f11"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:03:45 crc kubenswrapper[4795]: I0310 16:03:45.059320 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq9dd\" (UniqueName: \"kubernetes.io/projected/1ee9c800-27ee-456a-9138-0d42ce9a9f11-kube-api-access-tq9dd\") pod \"1ee9c800-27ee-456a-9138-0d42ce9a9f11\" (UID: \"1ee9c800-27ee-456a-9138-0d42ce9a9f11\") " Mar 10 16:03:45 crc kubenswrapper[4795]: I0310 16:03:45.059903 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1ee9c800-27ee-456a-9138-0d42ce9a9f11-host\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:45 crc kubenswrapper[4795]: I0310 16:03:45.071318 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ee9c800-27ee-456a-9138-0d42ce9a9f11-kube-api-access-tq9dd" (OuterVolumeSpecName: "kube-api-access-tq9dd") pod "1ee9c800-27ee-456a-9138-0d42ce9a9f11" (UID: "1ee9c800-27ee-456a-9138-0d42ce9a9f11"). InnerVolumeSpecName "kube-api-access-tq9dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:03:45 crc kubenswrapper[4795]: I0310 16:03:45.161929 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq9dd\" (UniqueName: \"kubernetes.io/projected/1ee9c800-27ee-456a-9138-0d42ce9a9f11-kube-api-access-tq9dd\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:45 crc kubenswrapper[4795]: I0310 16:03:45.491113 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ee9c800-27ee-456a-9138-0d42ce9a9f11" path="/var/lib/kubelet/pods/1ee9c800-27ee-456a-9138-0d42ce9a9f11/volumes" Mar 10 16:03:45 crc kubenswrapper[4795]: I0310 16:03:45.856199 4795 scope.go:117] "RemoveContainer" containerID="80ccd3c6d64c89a8863960a925a914c471f2ad26a8e7240cf0f55c860b43b96c" Mar 10 16:03:45 crc kubenswrapper[4795]: I0310 16:03:45.856227 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-f8j9w" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.188656 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7sdzb/crc-debug-2lf4k"] Mar 10 16:03:46 crc kubenswrapper[4795]: E0310 16:03:46.189104 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ee9c800-27ee-456a-9138-0d42ce9a9f11" containerName="container-00" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.189119 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ee9c800-27ee-456a-9138-0d42ce9a9f11" containerName="container-00" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.189356 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ee9c800-27ee-456a-9138-0d42ce9a9f11" containerName="container-00" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.190226 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.281339 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2f574cb-fbd5-4328-be40-85a012c16abe-host\") pod \"crc-debug-2lf4k\" (UID: \"a2f574cb-fbd5-4328-be40-85a012c16abe\") " pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.281717 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prhr\" (UniqueName: \"kubernetes.io/projected/a2f574cb-fbd5-4328-be40-85a012c16abe-kube-api-access-7prhr\") pod \"crc-debug-2lf4k\" (UID: \"a2f574cb-fbd5-4328-be40-85a012c16abe\") " pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.383905 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/a2f574cb-fbd5-4328-be40-85a012c16abe-host\") pod \"crc-debug-2lf4k\" (UID: \"a2f574cb-fbd5-4328-be40-85a012c16abe\") " pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.384019 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7prhr\" (UniqueName: \"kubernetes.io/projected/a2f574cb-fbd5-4328-be40-85a012c16abe-kube-api-access-7prhr\") pod \"crc-debug-2lf4k\" (UID: \"a2f574cb-fbd5-4328-be40-85a012c16abe\") " pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.384061 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2f574cb-fbd5-4328-be40-85a012c16abe-host\") pod \"crc-debug-2lf4k\" (UID: \"a2f574cb-fbd5-4328-be40-85a012c16abe\") " pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.416686 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prhr\" (UniqueName: \"kubernetes.io/projected/a2f574cb-fbd5-4328-be40-85a012c16abe-kube-api-access-7prhr\") pod \"crc-debug-2lf4k\" (UID: \"a2f574cb-fbd5-4328-be40-85a012c16abe\") " pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.505357 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.871904 4795 generic.go:334] "Generic (PLEG): container finished" podID="a2f574cb-fbd5-4328-be40-85a012c16abe" containerID="052f228cbb6da042dad4bdc3bd42781be20393e8d8e38df742bdc0e0820d45cf" exitCode=0 Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.872017 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" event={"ID":"a2f574cb-fbd5-4328-be40-85a012c16abe","Type":"ContainerDied","Data":"052f228cbb6da042dad4bdc3bd42781be20393e8d8e38df742bdc0e0820d45cf"} Mar 10 16:03:46 crc kubenswrapper[4795]: I0310 16:03:46.872059 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" event={"ID":"a2f574cb-fbd5-4328-be40-85a012c16abe","Type":"ContainerStarted","Data":"d1c6542a90b5e294b02b8a704bb79a2fd47958699defa50d2756ed1456c94426"} Mar 10 16:03:47 crc kubenswrapper[4795]: I0310 16:03:47.343611 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7sdzb/crc-debug-2lf4k"] Mar 10 16:03:47 crc kubenswrapper[4795]: I0310 16:03:47.351340 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7sdzb/crc-debug-2lf4k"] Mar 10 16:03:47 crc kubenswrapper[4795]: I0310 16:03:47.983622 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.114322 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2f574cb-fbd5-4328-be40-85a012c16abe-host\") pod \"a2f574cb-fbd5-4328-be40-85a012c16abe\" (UID: \"a2f574cb-fbd5-4328-be40-85a012c16abe\") " Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.114477 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7prhr\" (UniqueName: \"kubernetes.io/projected/a2f574cb-fbd5-4328-be40-85a012c16abe-kube-api-access-7prhr\") pod \"a2f574cb-fbd5-4328-be40-85a012c16abe\" (UID: \"a2f574cb-fbd5-4328-be40-85a012c16abe\") " Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.114548 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a2f574cb-fbd5-4328-be40-85a012c16abe-host" (OuterVolumeSpecName: "host") pod "a2f574cb-fbd5-4328-be40-85a012c16abe" (UID: "a2f574cb-fbd5-4328-be40-85a012c16abe"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.115016 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a2f574cb-fbd5-4328-be40-85a012c16abe-host\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.125318 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2f574cb-fbd5-4328-be40-85a012c16abe-kube-api-access-7prhr" (OuterVolumeSpecName: "kube-api-access-7prhr") pod "a2f574cb-fbd5-4328-be40-85a012c16abe" (UID: "a2f574cb-fbd5-4328-be40-85a012c16abe"). InnerVolumeSpecName "kube-api-access-7prhr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.216664 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7prhr\" (UniqueName: \"kubernetes.io/projected/a2f574cb-fbd5-4328-be40-85a012c16abe-kube-api-access-7prhr\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.525266 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7sdzb/crc-debug-zktwr"] Mar 10 16:03:48 crc kubenswrapper[4795]: E0310 16:03:48.525650 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2f574cb-fbd5-4328-be40-85a012c16abe" containerName="container-00" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.525663 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2f574cb-fbd5-4328-be40-85a012c16abe" containerName="container-00" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.525916 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2f574cb-fbd5-4328-be40-85a012c16abe" containerName="container-00" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.526649 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-zktwr" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.544575 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.544619 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.622875 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lnkr\" (UniqueName: \"kubernetes.io/projected/8a1eb185-08dc-4e6c-8958-146607097196-kube-api-access-8lnkr\") pod \"crc-debug-zktwr\" (UID: \"8a1eb185-08dc-4e6c-8958-146607097196\") " pod="openshift-must-gather-7sdzb/crc-debug-zktwr" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.623317 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a1eb185-08dc-4e6c-8958-146607097196-host\") pod \"crc-debug-zktwr\" (UID: \"8a1eb185-08dc-4e6c-8958-146607097196\") " pod="openshift-must-gather-7sdzb/crc-debug-zktwr" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.724699 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a1eb185-08dc-4e6c-8958-146607097196-host\") pod \"crc-debug-zktwr\" (UID: \"8a1eb185-08dc-4e6c-8958-146607097196\") " pod="openshift-must-gather-7sdzb/crc-debug-zktwr" Mar 10 16:03:48 crc 
kubenswrapper[4795]: I0310 16:03:48.724842 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lnkr\" (UniqueName: \"kubernetes.io/projected/8a1eb185-08dc-4e6c-8958-146607097196-kube-api-access-8lnkr\") pod \"crc-debug-zktwr\" (UID: \"8a1eb185-08dc-4e6c-8958-146607097196\") " pod="openshift-must-gather-7sdzb/crc-debug-zktwr" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.724936 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a1eb185-08dc-4e6c-8958-146607097196-host\") pod \"crc-debug-zktwr\" (UID: \"8a1eb185-08dc-4e6c-8958-146607097196\") " pod="openshift-must-gather-7sdzb/crc-debug-zktwr" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.746542 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lnkr\" (UniqueName: \"kubernetes.io/projected/8a1eb185-08dc-4e6c-8958-146607097196-kube-api-access-8lnkr\") pod \"crc-debug-zktwr\" (UID: \"8a1eb185-08dc-4e6c-8958-146607097196\") " pod="openshift-must-gather-7sdzb/crc-debug-zktwr" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.848345 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-zktwr" Mar 10 16:03:48 crc kubenswrapper[4795]: W0310 16:03:48.873009 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a1eb185_08dc_4e6c_8958_146607097196.slice/crio-712cb7e4d152badeaf0d5765409fb93ced4ff73fd526ad37d6c9b17461c76281 WatchSource:0}: Error finding container 712cb7e4d152badeaf0d5765409fb93ced4ff73fd526ad37d6c9b17461c76281: Status 404 returned error can't find the container with id 712cb7e4d152badeaf0d5765409fb93ced4ff73fd526ad37d6c9b17461c76281 Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.897186 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-2lf4k" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.897185 4795 scope.go:117] "RemoveContainer" containerID="052f228cbb6da042dad4bdc3bd42781be20393e8d8e38df742bdc0e0820d45cf" Mar 10 16:03:48 crc kubenswrapper[4795]: I0310 16:03:48.898786 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7sdzb/crc-debug-zktwr" event={"ID":"8a1eb185-08dc-4e6c-8958-146607097196","Type":"ContainerStarted","Data":"712cb7e4d152badeaf0d5765409fb93ced4ff73fd526ad37d6c9b17461c76281"} Mar 10 16:03:49 crc kubenswrapper[4795]: I0310 16:03:49.488654 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2f574cb-fbd5-4328-be40-85a012c16abe" path="/var/lib/kubelet/pods/a2f574cb-fbd5-4328-be40-85a012c16abe/volumes" Mar 10 16:03:49 crc kubenswrapper[4795]: I0310 16:03:49.913306 4795 generic.go:334] "Generic (PLEG): container finished" podID="8a1eb185-08dc-4e6c-8958-146607097196" containerID="082c9cc4236b7130d562f97cca35808c6a4a469fbe43a562985d4e89a18fe0e4" exitCode=0 Mar 10 16:03:49 crc kubenswrapper[4795]: I0310 16:03:49.913350 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7sdzb/crc-debug-zktwr" event={"ID":"8a1eb185-08dc-4e6c-8958-146607097196","Type":"ContainerDied","Data":"082c9cc4236b7130d562f97cca35808c6a4a469fbe43a562985d4e89a18fe0e4"} Mar 10 16:03:49 crc kubenswrapper[4795]: I0310 16:03:49.951620 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7sdzb/crc-debug-zktwr"] Mar 10 16:03:49 crc kubenswrapper[4795]: I0310 16:03:49.959453 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7sdzb/crc-debug-zktwr"] Mar 10 16:03:51 crc kubenswrapper[4795]: I0310 16:03:51.047227 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-zktwr" Mar 10 16:03:51 crc kubenswrapper[4795]: I0310 16:03:51.171554 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lnkr\" (UniqueName: \"kubernetes.io/projected/8a1eb185-08dc-4e6c-8958-146607097196-kube-api-access-8lnkr\") pod \"8a1eb185-08dc-4e6c-8958-146607097196\" (UID: \"8a1eb185-08dc-4e6c-8958-146607097196\") " Mar 10 16:03:51 crc kubenswrapper[4795]: I0310 16:03:51.171806 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a1eb185-08dc-4e6c-8958-146607097196-host\") pod \"8a1eb185-08dc-4e6c-8958-146607097196\" (UID: \"8a1eb185-08dc-4e6c-8958-146607097196\") " Mar 10 16:03:51 crc kubenswrapper[4795]: I0310 16:03:51.171953 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8a1eb185-08dc-4e6c-8958-146607097196-host" (OuterVolumeSpecName: "host") pod "8a1eb185-08dc-4e6c-8958-146607097196" (UID: "8a1eb185-08dc-4e6c-8958-146607097196"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 10 16:03:51 crc kubenswrapper[4795]: I0310 16:03:51.172249 4795 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8a1eb185-08dc-4e6c-8958-146607097196-host\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:51 crc kubenswrapper[4795]: I0310 16:03:51.177758 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a1eb185-08dc-4e6c-8958-146607097196-kube-api-access-8lnkr" (OuterVolumeSpecName: "kube-api-access-8lnkr") pod "8a1eb185-08dc-4e6c-8958-146607097196" (UID: "8a1eb185-08dc-4e6c-8958-146607097196"). InnerVolumeSpecName "kube-api-access-8lnkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:03:51 crc kubenswrapper[4795]: I0310 16:03:51.274325 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lnkr\" (UniqueName: \"kubernetes.io/projected/8a1eb185-08dc-4e6c-8958-146607097196-kube-api-access-8lnkr\") on node \"crc\" DevicePath \"\"" Mar 10 16:03:51 crc kubenswrapper[4795]: I0310 16:03:51.488790 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a1eb185-08dc-4e6c-8958-146607097196" path="/var/lib/kubelet/pods/8a1eb185-08dc-4e6c-8958-146607097196/volumes" Mar 10 16:03:51 crc kubenswrapper[4795]: I0310 16:03:51.929439 4795 scope.go:117] "RemoveContainer" containerID="082c9cc4236b7130d562f97cca35808c6a4a469fbe43a562985d4e89a18fe0e4" Mar 10 16:03:51 crc kubenswrapper[4795]: I0310 16:03:51.929507 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7sdzb/crc-debug-zktwr" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.340855 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-92b92"] Mar 10 16:03:55 crc kubenswrapper[4795]: E0310 16:03:55.341759 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a1eb185-08dc-4e6c-8958-146607097196" containerName="container-00" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.341775 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a1eb185-08dc-4e6c-8958-146607097196" containerName="container-00" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.342011 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a1eb185-08dc-4e6c-8958-146607097196" containerName="container-00" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.343357 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-92b92" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.355337 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-92b92"] Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.447545 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hbxx\" (UniqueName: \"kubernetes.io/projected/3782d74f-790d-47fb-8d34-27ef970b3231-kube-api-access-5hbxx\") pod \"community-operators-92b92\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " pod="openshift-marketplace/community-operators-92b92" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.447660 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-utilities\") pod \"community-operators-92b92\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " pod="openshift-marketplace/community-operators-92b92" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.447704 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-catalog-content\") pod \"community-operators-92b92\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " pod="openshift-marketplace/community-operators-92b92" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.549707 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-utilities\") pod \"community-operators-92b92\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " pod="openshift-marketplace/community-operators-92b92" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.550326 4795 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-utilities\") pod \"community-operators-92b92\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " pod="openshift-marketplace/community-operators-92b92" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.549787 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-catalog-content\") pod \"community-operators-92b92\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " pod="openshift-marketplace/community-operators-92b92" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.550428 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-catalog-content\") pod \"community-operators-92b92\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " pod="openshift-marketplace/community-operators-92b92" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.550823 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hbxx\" (UniqueName: \"kubernetes.io/projected/3782d74f-790d-47fb-8d34-27ef970b3231-kube-api-access-5hbxx\") pod \"community-operators-92b92\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " pod="openshift-marketplace/community-operators-92b92" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.578051 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hbxx\" (UniqueName: \"kubernetes.io/projected/3782d74f-790d-47fb-8d34-27ef970b3231-kube-api-access-5hbxx\") pod \"community-operators-92b92\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " pod="openshift-marketplace/community-operators-92b92" Mar 10 16:03:55 crc kubenswrapper[4795]: I0310 16:03:55.680252 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-92b92" Mar 10 16:03:56 crc kubenswrapper[4795]: I0310 16:03:56.300765 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-92b92"] Mar 10 16:03:57 crc kubenswrapper[4795]: I0310 16:03:57.011169 4795 generic.go:334] "Generic (PLEG): container finished" podID="3782d74f-790d-47fb-8d34-27ef970b3231" containerID="d47d3614f39807fea1ebd465a10c549625a68cb013e850b56a5cbbaff4c526cd" exitCode=0 Mar 10 16:03:57 crc kubenswrapper[4795]: I0310 16:03:57.011313 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92b92" event={"ID":"3782d74f-790d-47fb-8d34-27ef970b3231","Type":"ContainerDied","Data":"d47d3614f39807fea1ebd465a10c549625a68cb013e850b56a5cbbaff4c526cd"} Mar 10 16:03:57 crc kubenswrapper[4795]: I0310 16:03:57.011532 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92b92" event={"ID":"3782d74f-790d-47fb-8d34-27ef970b3231","Type":"ContainerStarted","Data":"e4f6457fd4aa051c2d287203aa4e690040e6ca3bf42355a8e9a99415c2ba0ca7"} Mar 10 16:03:58 crc kubenswrapper[4795]: I0310 16:03:58.021856 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92b92" event={"ID":"3782d74f-790d-47fb-8d34-27ef970b3231","Type":"ContainerStarted","Data":"8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd"} Mar 10 16:03:59 crc kubenswrapper[4795]: I0310 16:03:59.043166 4795 generic.go:334] "Generic (PLEG): container finished" podID="3782d74f-790d-47fb-8d34-27ef970b3231" containerID="8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd" exitCode=0 Mar 10 16:03:59 crc kubenswrapper[4795]: I0310 16:03:59.043421 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92b92" 
event={"ID":"3782d74f-790d-47fb-8d34-27ef970b3231","Type":"ContainerDied","Data":"8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd"} Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.056106 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92b92" event={"ID":"3782d74f-790d-47fb-8d34-27ef970b3231","Type":"ContainerStarted","Data":"9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d"} Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.089197 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-92b92" podStartSLOduration=2.5993671689999998 podStartE2EDuration="5.089174371s" podCreationTimestamp="2026-03-10 16:03:55 +0000 UTC" firstStartedPulling="2026-03-10 16:03:57.012989427 +0000 UTC m=+3470.178730325" lastFinishedPulling="2026-03-10 16:03:59.502796629 +0000 UTC m=+3472.668537527" observedRunningTime="2026-03-10 16:04:00.085688702 +0000 UTC m=+3473.251429610" watchObservedRunningTime="2026-03-10 16:04:00.089174371 +0000 UTC m=+3473.254915289" Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.140787 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552644-g6vgh"] Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.142731 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-g6vgh" Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.150954 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.151271 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.151607 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.151739 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-g6vgh"] Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.337535 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf68z\" (UniqueName: \"kubernetes.io/projected/420efd3a-8ad6-4262-821d-17a19ce0c8ae-kube-api-access-qf68z\") pod \"auto-csr-approver-29552644-g6vgh\" (UID: \"420efd3a-8ad6-4262-821d-17a19ce0c8ae\") " pod="openshift-infra/auto-csr-approver-29552644-g6vgh" Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.439230 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf68z\" (UniqueName: \"kubernetes.io/projected/420efd3a-8ad6-4262-821d-17a19ce0c8ae-kube-api-access-qf68z\") pod \"auto-csr-approver-29552644-g6vgh\" (UID: \"420efd3a-8ad6-4262-821d-17a19ce0c8ae\") " pod="openshift-infra/auto-csr-approver-29552644-g6vgh" Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.463680 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf68z\" (UniqueName: \"kubernetes.io/projected/420efd3a-8ad6-4262-821d-17a19ce0c8ae-kube-api-access-qf68z\") pod \"auto-csr-approver-29552644-g6vgh\" (UID: \"420efd3a-8ad6-4262-821d-17a19ce0c8ae\") " 
pod="openshift-infra/auto-csr-approver-29552644-g6vgh" Mar 10 16:04:00 crc kubenswrapper[4795]: I0310 16:04:00.763809 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-g6vgh" Mar 10 16:04:01 crc kubenswrapper[4795]: I0310 16:04:01.215488 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-g6vgh"] Mar 10 16:04:01 crc kubenswrapper[4795]: W0310 16:04:01.224406 4795 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod420efd3a_8ad6_4262_821d_17a19ce0c8ae.slice/crio-9a6a480421a16f9462c7eed22658308afb378783711202bb0abb1593f4169817 WatchSource:0}: Error finding container 9a6a480421a16f9462c7eed22658308afb378783711202bb0abb1593f4169817: Status 404 returned error can't find the container with id 9a6a480421a16f9462c7eed22658308afb378783711202bb0abb1593f4169817 Mar 10 16:04:02 crc kubenswrapper[4795]: I0310 16:04:02.072600 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552644-g6vgh" event={"ID":"420efd3a-8ad6-4262-821d-17a19ce0c8ae","Type":"ContainerStarted","Data":"9a6a480421a16f9462c7eed22658308afb378783711202bb0abb1593f4169817"} Mar 10 16:04:03 crc kubenswrapper[4795]: I0310 16:04:03.096351 4795 generic.go:334] "Generic (PLEG): container finished" podID="420efd3a-8ad6-4262-821d-17a19ce0c8ae" containerID="a680b6bb33ff0415a2633ce8f9989cf2d313a1644cb9c31b808e6f996eafaf87" exitCode=0 Mar 10 16:04:03 crc kubenswrapper[4795]: I0310 16:04:03.096625 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552644-g6vgh" event={"ID":"420efd3a-8ad6-4262-821d-17a19ce0c8ae","Type":"ContainerDied","Data":"a680b6bb33ff0415a2633ce8f9989cf2d313a1644cb9c31b808e6f996eafaf87"} Mar 10 16:04:04 crc kubenswrapper[4795]: I0310 16:04:04.516581 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-g6vgh" Mar 10 16:04:04 crc kubenswrapper[4795]: I0310 16:04:04.615980 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf68z\" (UniqueName: \"kubernetes.io/projected/420efd3a-8ad6-4262-821d-17a19ce0c8ae-kube-api-access-qf68z\") pod \"420efd3a-8ad6-4262-821d-17a19ce0c8ae\" (UID: \"420efd3a-8ad6-4262-821d-17a19ce0c8ae\") " Mar 10 16:04:04 crc kubenswrapper[4795]: I0310 16:04:04.633016 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/420efd3a-8ad6-4262-821d-17a19ce0c8ae-kube-api-access-qf68z" (OuterVolumeSpecName: "kube-api-access-qf68z") pod "420efd3a-8ad6-4262-821d-17a19ce0c8ae" (UID: "420efd3a-8ad6-4262-821d-17a19ce0c8ae"). InnerVolumeSpecName "kube-api-access-qf68z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:04:04 crc kubenswrapper[4795]: I0310 16:04:04.718974 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf68z\" (UniqueName: \"kubernetes.io/projected/420efd3a-8ad6-4262-821d-17a19ce0c8ae-kube-api-access-qf68z\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.117647 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552644-g6vgh" event={"ID":"420efd3a-8ad6-4262-821d-17a19ce0c8ae","Type":"ContainerDied","Data":"9a6a480421a16f9462c7eed22658308afb378783711202bb0abb1593f4169817"} Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.117693 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a6a480421a16f9462c7eed22658308afb378783711202bb0abb1593f4169817" Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.117750 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552644-g6vgh" Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.542724 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b68874cc4-rzntx_0940a851-b873-4348-b89d-6ca90cf8646f/barbican-api/0.log" Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.588465 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-dwl48"] Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.598504 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552638-dwl48"] Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.681096 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-92b92" Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.681332 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-92b92" Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.730963 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-92b92" Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.738698 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-b68874cc4-rzntx_0940a851-b873-4348-b89d-6ca90cf8646f/barbican-api-log/0.log" Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.780426 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67b995dbd6-dgj6m_bd0afebe-9b88-450e-88e1-641870206db5/barbican-keystone-listener/0.log" Mar 10 16:04:05 crc kubenswrapper[4795]: I0310 16:04:05.847001 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-67b995dbd6-dgj6m_bd0afebe-9b88-450e-88e1-641870206db5/barbican-keystone-listener-log/0.log" Mar 10 16:04:05 crc kubenswrapper[4795]: 
I0310 16:04:05.983156 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76476484b5-x6dzm_0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f/barbican-worker/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.009524 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-76476484b5-x6dzm_0c4057cd-aaf0-4dd9-a0a2-e6aa3044076f/barbican-worker-log/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.174402 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-h8sq6_74f728b8-775b-45f0-90f9-8e4d8e77e5bc/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.185435 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-92b92" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.234575 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7066ac26-5bcb-472a-ba10-c8e08af7f0b3/ceilometer-central-agent/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.258038 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-92b92"] Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.279308 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7066ac26-5bcb-472a-ba10-c8e08af7f0b3/ceilometer-notification-agent/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.401200 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7066ac26-5bcb-472a-ba10-c8e08af7f0b3/proxy-httpd/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.433537 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_7066ac26-5bcb-472a-ba10-c8e08af7f0b3/sg-core/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.480092 4795 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_eaa71521-8502-4baa-a81f-7c8147ffd6a5/cinder-api/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.592130 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_eaa71521-8502-4baa-a81f-7c8147ffd6a5/cinder-api-log/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.687342 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fd6c655a-bae7-4234-9d8a-b585e74a75e6/probe/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.724333 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_fd6c655a-bae7-4234-9d8a-b585e74a75e6/cinder-scheduler/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.905701 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-j8sqm_40ff419c-9d5a-4d92-8bfe-40edd38f79ba/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:06 crc kubenswrapper[4795]: I0310 16:04:06.954110 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-8t9kt_9b45b478-8da9-468e-94d0-c6eca284ef60/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:07 crc kubenswrapper[4795]: I0310 16:04:07.086333 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-4k6tc_888ca29c-cfea-4f70-8f7b-f0539e3df18b/init/0.log" Mar 10 16:04:07 crc kubenswrapper[4795]: I0310 16:04:07.267709 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-4k6tc_888ca29c-cfea-4f70-8f7b-f0539e3df18b/init/0.log" Mar 10 16:04:07 crc kubenswrapper[4795]: I0310 16:04:07.310790 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-d992r_0d84f498-2364-4d50-8dfa-c49547c2e29a/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:07 crc kubenswrapper[4795]: I0310 16:04:07.344144 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-4k6tc_888ca29c-cfea-4f70-8f7b-f0539e3df18b/dnsmasq-dns/0.log" Mar 10 16:04:07 crc kubenswrapper[4795]: I0310 16:04:07.471563 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4f01cb9f-0169-4daa-b31f-0ab1e38d96ce/glance-httpd/0.log" Mar 10 16:04:07 crc kubenswrapper[4795]: I0310 16:04:07.499959 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a5da95-c50d-4e22-9629-f744cfcfa649" path="/var/lib/kubelet/pods/31a5da95-c50d-4e22-9629-f744cfcfa649/volumes" Mar 10 16:04:07 crc kubenswrapper[4795]: I0310 16:04:07.503805 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_4f01cb9f-0169-4daa-b31f-0ab1e38d96ce/glance-log/0.log" Mar 10 16:04:07 crc kubenswrapper[4795]: I0310 16:04:07.665996 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_15823ffe-ed18-451a-95e7-30ebee5218f3/glance-httpd/0.log" Mar 10 16:04:07 crc kubenswrapper[4795]: I0310 16:04:07.715023 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_15823ffe-ed18-451a-95e7-30ebee5218f3/glance-log/0.log" Mar 10 16:04:07 crc kubenswrapper[4795]: I0310 16:04:07.853659 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5869d54dfb-2wjww_379541ea-de81-488c-b6dc-2f5873fdfbeb/horizon/0.log" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.044828 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-stfzl_e6878552-5eb7-470b-a482-f1be0b632858/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.125023 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5869d54dfb-2wjww_379541ea-de81-488c-b6dc-2f5873fdfbeb/horizon-log/0.log" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.140836 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-92b92" podUID="3782d74f-790d-47fb-8d34-27ef970b3231" containerName="registry-server" containerID="cri-o://9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d" gracePeriod=2 Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.231463 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-snr99_f7025f5a-4547-4d24-8e11-bb54a9ff4311/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.514339 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29552641-682xt_31964fe2-d4bb-4fe8-8046-c266914fe1b3/keystone-cron/0.log" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.561453 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5b46885bb9-hzjqn_d045398c-e0bb-47f7-b069-67c75ba5dab5/keystone-api/0.log" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.713883 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_f21f73ad-9726-44ae-a239-21adfeb80d1f/kube-state-metrics/0.log" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.741075 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-92b92" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.843648 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-zdmj5_e98c0b12-b750-4125-b3c6-170a91a0aa0e/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.896688 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-catalog-content\") pod \"3782d74f-790d-47fb-8d34-27ef970b3231\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.896756 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-utilities\") pod \"3782d74f-790d-47fb-8d34-27ef970b3231\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.897001 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hbxx\" (UniqueName: \"kubernetes.io/projected/3782d74f-790d-47fb-8d34-27ef970b3231-kube-api-access-5hbxx\") pod \"3782d74f-790d-47fb-8d34-27ef970b3231\" (UID: \"3782d74f-790d-47fb-8d34-27ef970b3231\") " Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.897752 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-utilities" (OuterVolumeSpecName: "utilities") pod "3782d74f-790d-47fb-8d34-27ef970b3231" (UID: "3782d74f-790d-47fb-8d34-27ef970b3231"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.912915 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3782d74f-790d-47fb-8d34-27ef970b3231-kube-api-access-5hbxx" (OuterVolumeSpecName: "kube-api-access-5hbxx") pod "3782d74f-790d-47fb-8d34-27ef970b3231" (UID: "3782d74f-790d-47fb-8d34-27ef970b3231"). InnerVolumeSpecName "kube-api-access-5hbxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.973889 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3782d74f-790d-47fb-8d34-27ef970b3231" (UID: "3782d74f-790d-47fb-8d34-27ef970b3231"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.998451 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hbxx\" (UniqueName: \"kubernetes.io/projected/3782d74f-790d-47fb-8d34-27ef970b3231-kube-api-access-5hbxx\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.998725 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:08 crc kubenswrapper[4795]: I0310 16:04:08.998810 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3782d74f-790d-47fb-8d34-27ef970b3231-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.167033 4795 generic.go:334] "Generic (PLEG): container finished" podID="3782d74f-790d-47fb-8d34-27ef970b3231" 
containerID="9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d" exitCode=0 Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.167399 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92b92" event={"ID":"3782d74f-790d-47fb-8d34-27ef970b3231","Type":"ContainerDied","Data":"9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d"} Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.167309 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-92b92" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.167457 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-92b92" event={"ID":"3782d74f-790d-47fb-8d34-27ef970b3231","Type":"ContainerDied","Data":"e4f6457fd4aa051c2d287203aa4e690040e6ca3bf42355a8e9a99415c2ba0ca7"} Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.167483 4795 scope.go:117] "RemoveContainer" containerID="9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.192352 4795 scope.go:117] "RemoveContainer" containerID="8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.217111 4795 scope.go:117] "RemoveContainer" containerID="d47d3614f39807fea1ebd465a10c549625a68cb013e850b56a5cbbaff4c526cd" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.233405 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-9f8b48dd7-fxv5t_a43b4908-0c2c-4d7c-8aff-1cd405684654/neutron-api/0.log" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.244875 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-92b92"] Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.254989 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-92b92"] Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.259875 4795 scope.go:117] "RemoveContainer" containerID="9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d" Mar 10 16:04:09 crc kubenswrapper[4795]: E0310 16:04:09.260390 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d\": container with ID starting with 9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d not found: ID does not exist" containerID="9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.260439 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d"} err="failed to get container status \"9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d\": rpc error: code = NotFound desc = could not find container \"9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d\": container with ID starting with 9a24eb7931802ae69c6eb277c8f193bb198473b76f92c376d9b4c5a00c43886d not found: ID does not exist" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.260465 4795 scope.go:117] "RemoveContainer" containerID="8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd" Mar 10 16:04:09 crc kubenswrapper[4795]: E0310 16:04:09.260752 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd\": container with ID starting with 8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd not found: ID does not exist" containerID="8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 
16:04:09.260786 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd"} err="failed to get container status \"8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd\": rpc error: code = NotFound desc = could not find container \"8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd\": container with ID starting with 8a659befb717b201afbdde0c9255de33f8d0f69d645f1d46ad8a9533b128f2fd not found: ID does not exist" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.260811 4795 scope.go:117] "RemoveContainer" containerID="d47d3614f39807fea1ebd465a10c549625a68cb013e850b56a5cbbaff4c526cd" Mar 10 16:04:09 crc kubenswrapper[4795]: E0310 16:04:09.261324 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d47d3614f39807fea1ebd465a10c549625a68cb013e850b56a5cbbaff4c526cd\": container with ID starting with d47d3614f39807fea1ebd465a10c549625a68cb013e850b56a5cbbaff4c526cd not found: ID does not exist" containerID="d47d3614f39807fea1ebd465a10c549625a68cb013e850b56a5cbbaff4c526cd" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.261347 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47d3614f39807fea1ebd465a10c549625a68cb013e850b56a5cbbaff4c526cd"} err="failed to get container status \"d47d3614f39807fea1ebd465a10c549625a68cb013e850b56a5cbbaff4c526cd\": rpc error: code = NotFound desc = could not find container \"d47d3614f39807fea1ebd465a10c549625a68cb013e850b56a5cbbaff4c526cd\": container with ID starting with d47d3614f39807fea1ebd465a10c549625a68cb013e850b56a5cbbaff4c526cd not found: ID does not exist" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.289157 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-9f8b48dd7-fxv5t_a43b4908-0c2c-4d7c-8aff-1cd405684654/neutron-httpd/0.log" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.497147 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3782d74f-790d-47fb-8d34-27ef970b3231" path="/var/lib/kubelet/pods/3782d74f-790d-47fb-8d34-27ef970b3231/volumes" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.500797 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-jcm84_2cd0441d-482a-4db1-a831-4dfca1afb6f4/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.950892 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3097e989-78f5-453d-a495-fc8b4a805efb/nova-cell0-conductor-conductor/0.log" Mar 10 16:04:09 crc kubenswrapper[4795]: I0310 16:04:09.996453 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6edfb391-f92d-4a3c-9e60-e32038dc9f5e/nova-api-log/0.log" Mar 10 16:04:10 crc kubenswrapper[4795]: I0310 16:04:10.117865 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_6edfb391-f92d-4a3c-9e60-e32038dc9f5e/nova-api-api/0.log" Mar 10 16:04:10 crc kubenswrapper[4795]: I0310 16:04:10.206794 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7f3cf6cb-a222-4fe9-a40d-9e4cf686f58d/nova-cell1-conductor-conductor/0.log" Mar 10 16:04:10 crc kubenswrapper[4795]: I0310 16:04:10.311566 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_62640f9a-d168-4f85-83e0-90caea1b50d4/nova-cell1-novncproxy-novncproxy/0.log" Mar 10 16:04:10 crc kubenswrapper[4795]: I0310 16:04:10.471732 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-snhgt_c8b65855-20a1-4f05-9d95-89cc7a05baaa/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:10 crc kubenswrapper[4795]: I0310 16:04:10.639834 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b9d34d9b-f1e9-420e-9b30-d99a9b30f33c/nova-metadata-log/0.log" Mar 10 16:04:10 crc kubenswrapper[4795]: I0310 16:04:10.902434 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_020512d0-7890-48cf-8ce2-f9d08feef2e6/nova-scheduler-scheduler/0.log" Mar 10 16:04:10 crc kubenswrapper[4795]: I0310 16:04:10.943243 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d/mysql-bootstrap/0.log" Mar 10 16:04:11 crc kubenswrapper[4795]: I0310 16:04:11.194744 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d/galera/0.log" Mar 10 16:04:11 crc kubenswrapper[4795]: I0310 16:04:11.208632 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cd0fbf0b-7154-4f2e-b5b8-896cdceb4d0d/mysql-bootstrap/0.log" Mar 10 16:04:11 crc kubenswrapper[4795]: I0310 16:04:11.414156 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b8e5711d-12e4-458f-a944-6b37aca4afa3/mysql-bootstrap/0.log" Mar 10 16:04:11 crc kubenswrapper[4795]: I0310 16:04:11.562566 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b9d34d9b-f1e9-420e-9b30-d99a9b30f33c/nova-metadata-metadata/0.log" Mar 10 16:04:11 crc kubenswrapper[4795]: I0310 16:04:11.571529 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_b8e5711d-12e4-458f-a944-6b37aca4afa3/galera/0.log" Mar 10 16:04:11 crc kubenswrapper[4795]: I0310 16:04:11.607022 4795 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_openstack-galera-0_b8e5711d-12e4-458f-a944-6b37aca4afa3/mysql-bootstrap/0.log" Mar 10 16:04:11 crc kubenswrapper[4795]: I0310 16:04:11.789657 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-j7t9w_fbfac213-1d19-4c6f-b88b-e7513792f2a1/openstack-network-exporter/0.log" Mar 10 16:04:11 crc kubenswrapper[4795]: I0310 16:04:11.841477 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_64902640-6d88-46eb-98f0-475f8f976aaa/openstackclient/0.log" Mar 10 16:04:12 crc kubenswrapper[4795]: I0310 16:04:12.013144 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sblqq_70f8a962-ce88-4e90-91b3-5272104b9d18/ovsdb-server-init/0.log" Mar 10 16:04:12 crc kubenswrapper[4795]: I0310 16:04:12.177300 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sblqq_70f8a962-ce88-4e90-91b3-5272104b9d18/ovsdb-server-init/0.log" Mar 10 16:04:12 crc kubenswrapper[4795]: I0310 16:04:12.214897 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sblqq_70f8a962-ce88-4e90-91b3-5272104b9d18/ovs-vswitchd/0.log" Mar 10 16:04:12 crc kubenswrapper[4795]: I0310 16:04:12.224402 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-sblqq_70f8a962-ce88-4e90-91b3-5272104b9d18/ovsdb-server/0.log" Mar 10 16:04:12 crc kubenswrapper[4795]: I0310 16:04:12.428904 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-x9p5v_4823094a-6e7f-49da-9aa5-7d67b893896c/ovn-controller/0.log" Mar 10 16:04:12 crc kubenswrapper[4795]: I0310 16:04:12.743168 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-pf7c6_09d32c78-bc8e-480b-b6e0-bfd1d5eedf66/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:12 crc kubenswrapper[4795]: I0310 
16:04:12.808259 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f8be8f3c-b4c1-41dc-99bf-3950c21ce504/openstack-network-exporter/0.log" Mar 10 16:04:12 crc kubenswrapper[4795]: I0310 16:04:12.963980 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f8be8f3c-b4c1-41dc-99bf-3950c21ce504/ovn-northd/0.log" Mar 10 16:04:13 crc kubenswrapper[4795]: I0310 16:04:13.043442 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d58ad85e-6f98-4ba8-97b9-656dda7a5b93/openstack-network-exporter/0.log" Mar 10 16:04:13 crc kubenswrapper[4795]: I0310 16:04:13.056420 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d58ad85e-6f98-4ba8-97b9-656dda7a5b93/ovsdbserver-nb/0.log" Mar 10 16:04:13 crc kubenswrapper[4795]: I0310 16:04:13.276783 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8572cc94-1e6e-406c-b57b-56167baa0a87/ovsdbserver-sb/0.log" Mar 10 16:04:13 crc kubenswrapper[4795]: I0310 16:04:13.318728 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_8572cc94-1e6e-406c-b57b-56167baa0a87/openstack-network-exporter/0.log" Mar 10 16:04:13 crc kubenswrapper[4795]: I0310 16:04:13.540415 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56fb779756-g2577_99b89308-d3a5-4f4d-ae50-92ae07fb6941/placement-api/0.log" Mar 10 16:04:13 crc kubenswrapper[4795]: I0310 16:04:13.584524 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_460e64c3-819a-4bfe-859f-ea0b5400be76/setup-container/0.log" Mar 10 16:04:13 crc kubenswrapper[4795]: I0310 16:04:13.662924 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56fb779756-g2577_99b89308-d3a5-4f4d-ae50-92ae07fb6941/placement-log/0.log" Mar 10 16:04:13 crc kubenswrapper[4795]: I0310 16:04:13.764370 4795 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_460e64c3-819a-4bfe-859f-ea0b5400be76/setup-container/0.log" Mar 10 16:04:13 crc kubenswrapper[4795]: I0310 16:04:13.793845 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_460e64c3-819a-4bfe-859f-ea0b5400be76/rabbitmq/0.log" Mar 10 16:04:13 crc kubenswrapper[4795]: I0310 16:04:13.941258 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b9cd90de-c39a-41a8-92cb-1f2dc799209f/setup-container/0.log" Mar 10 16:04:14 crc kubenswrapper[4795]: I0310 16:04:14.095295 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b9cd90de-c39a-41a8-92cb-1f2dc799209f/rabbitmq/0.log" Mar 10 16:04:14 crc kubenswrapper[4795]: I0310 16:04:14.136746 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b9cd90de-c39a-41a8-92cb-1f2dc799209f/setup-container/0.log" Mar 10 16:04:14 crc kubenswrapper[4795]: I0310 16:04:14.186247 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-4kprx_6fc691de-4716-4cb9-9e09-f085b6bc7625/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:14 crc kubenswrapper[4795]: I0310 16:04:14.425244 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ms8kd_89beb105-18d9-49fa-9eda-afe0819518d9/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:14 crc kubenswrapper[4795]: I0310 16:04:14.444711 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-5nnbx_2d340909-34ac-4a94-89f5-c4759eb3374f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:14 crc kubenswrapper[4795]: I0310 16:04:14.646661 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-m6ln6_672817be-38e6-4a34-9ef5-5c73fed66fdb/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:14 crc kubenswrapper[4795]: I0310 16:04:14.675318 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-k2fn2_def1f0a0-23d4-496c-b75f-037e2666d444/ssh-known-hosts-edpm-deployment/0.log" Mar 10 16:04:14 crc kubenswrapper[4795]: I0310 16:04:14.913024 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6b868dfb95-x8b6r_62ae38d2-b7a6-4c50-8506-dc3c18a89fd1/proxy-server/0.log" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.019516 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6b868dfb95-x8b6r_62ae38d2-b7a6-4c50-8506-dc3c18a89fd1/proxy-httpd/0.log" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.119745 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-nwz6d_c3379cef-da13-42d0-80b5-1600bbde9f95/swift-ring-rebalance/0.log" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.203451 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/account-auditor/0.log" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.247019 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/account-reaper/0.log" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.378810 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/account-replicator/0.log" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.394663 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/account-server/0.log" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 
16:04:15.412734 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8cjnt"] Mar 10 16:04:15 crc kubenswrapper[4795]: E0310 16:04:15.413387 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3782d74f-790d-47fb-8d34-27ef970b3231" containerName="extract-utilities" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.413408 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3782d74f-790d-47fb-8d34-27ef970b3231" containerName="extract-utilities" Mar 10 16:04:15 crc kubenswrapper[4795]: E0310 16:04:15.413427 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3782d74f-790d-47fb-8d34-27ef970b3231" containerName="extract-content" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.413434 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3782d74f-790d-47fb-8d34-27ef970b3231" containerName="extract-content" Mar 10 16:04:15 crc kubenswrapper[4795]: E0310 16:04:15.413451 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="420efd3a-8ad6-4262-821d-17a19ce0c8ae" containerName="oc" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.413458 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="420efd3a-8ad6-4262-821d-17a19ce0c8ae" containerName="oc" Mar 10 16:04:15 crc kubenswrapper[4795]: E0310 16:04:15.413469 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3782d74f-790d-47fb-8d34-27ef970b3231" containerName="registry-server" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.413475 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="3782d74f-790d-47fb-8d34-27ef970b3231" containerName="registry-server" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.413682 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="420efd3a-8ad6-4262-821d-17a19ce0c8ae" containerName="oc" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.413708 4795 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="3782d74f-790d-47fb-8d34-27ef970b3231" containerName="registry-server" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.416396 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.418145 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cjnt"] Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.589930 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/container-auditor/0.log" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.609472 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/container-replicator/0.log" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.622276 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-catalog-content\") pod \"certified-operators-8cjnt\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.622354 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v4gt\" (UniqueName: \"kubernetes.io/projected/9560dc25-b942-4872-b040-2b09be7d3c15-kube-api-access-9v4gt\") pod \"certified-operators-8cjnt\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.622458 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-utilities\") pod 
\"certified-operators-8cjnt\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.628762 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/container-updater/0.log" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.723622 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-utilities\") pod \"certified-operators-8cjnt\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.724108 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-catalog-content\") pod \"certified-operators-8cjnt\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.724252 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v4gt\" (UniqueName: \"kubernetes.io/projected/9560dc25-b942-4872-b040-2b09be7d3c15-kube-api-access-9v4gt\") pod \"certified-operators-8cjnt\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.724593 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-utilities\") pod \"certified-operators-8cjnt\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.725061 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-catalog-content\") pod \"certified-operators-8cjnt\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.762380 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v4gt\" (UniqueName: \"kubernetes.io/projected/9560dc25-b942-4872-b040-2b09be7d3c15-kube-api-access-9v4gt\") pod \"certified-operators-8cjnt\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.764922 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/container-server/0.log" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.770438 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:15 crc kubenswrapper[4795]: I0310 16:04:15.844482 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/object-auditor/0.log" Mar 10 16:04:16 crc kubenswrapper[4795]: I0310 16:04:16.041180 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/object-expirer/0.log" Mar 10 16:04:16 crc kubenswrapper[4795]: I0310 16:04:16.266963 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/object-server/0.log" Mar 10 16:04:16 crc kubenswrapper[4795]: I0310 16:04:16.308025 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/object-replicator/0.log" Mar 10 16:04:16 crc kubenswrapper[4795]: I0310 16:04:16.465514 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8cjnt"] Mar 10 16:04:16 crc kubenswrapper[4795]: I0310 16:04:16.526998 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/object-updater/0.log" Mar 10 16:04:16 crc kubenswrapper[4795]: I0310 16:04:16.701392 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/rsync/0.log" Mar 10 16:04:16 crc kubenswrapper[4795]: I0310 16:04:16.853754 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_612389f7-00cb-49cc-9daf-a5e451d6312f/swift-recon-cron/0.log" Mar 10 16:04:16 crc kubenswrapper[4795]: I0310 16:04:16.899634 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-zlgdl_91b3b9ad-6b02-4647-b3b6-1789e0237c73/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:16 crc kubenswrapper[4795]: I0310 16:04:16.995696 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_865ec795-fb93-4628-bdf5-5451ffbf2c0c/tempest-tests-tempest-tests-runner/0.log" Mar 10 16:04:17 crc kubenswrapper[4795]: I0310 16:04:17.089051 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_718d7d95-6ec8-4bb6-9014-f1a7bf22cdca/test-operator-logs-container/0.log" Mar 10 16:04:17 crc kubenswrapper[4795]: I0310 16:04:17.250635 4795 generic.go:334] "Generic (PLEG): container finished" podID="9560dc25-b942-4872-b040-2b09be7d3c15" containerID="9b42452cce523046249bb5f26c9b03fc4df69dfda71b1d151a9a67bbb06a3924" exitCode=0 Mar 10 16:04:17 crc kubenswrapper[4795]: I0310 16:04:17.250761 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjnt" event={"ID":"9560dc25-b942-4872-b040-2b09be7d3c15","Type":"ContainerDied","Data":"9b42452cce523046249bb5f26c9b03fc4df69dfda71b1d151a9a67bbb06a3924"} Mar 10 16:04:17 crc kubenswrapper[4795]: I0310 16:04:17.250807 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjnt" event={"ID":"9560dc25-b942-4872-b040-2b09be7d3c15","Type":"ContainerStarted","Data":"0da96a061037d602b9061d7e639937758d758cbec6ac444da986ace6f08ab15e"} Mar 10 16:04:17 crc kubenswrapper[4795]: I0310 16:04:17.307153 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-kdxs2_c877826d-0c20-4e0d-b4b9-e11b301c36d3/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 10 16:04:18 crc kubenswrapper[4795]: I0310 16:04:18.538711 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:04:18 crc kubenswrapper[4795]: I0310 16:04:18.539030 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:04:19 crc kubenswrapper[4795]: I0310 16:04:19.278075 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjnt" event={"ID":"9560dc25-b942-4872-b040-2b09be7d3c15","Type":"ContainerStarted","Data":"badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4"} Mar 10 16:04:21 crc kubenswrapper[4795]: I0310 16:04:21.295592 4795 generic.go:334] "Generic (PLEG): container finished" podID="9560dc25-b942-4872-b040-2b09be7d3c15" containerID="badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4" exitCode=0 Mar 10 16:04:21 crc kubenswrapper[4795]: I0310 16:04:21.295639 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjnt" event={"ID":"9560dc25-b942-4872-b040-2b09be7d3c15","Type":"ContainerDied","Data":"badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4"} Mar 10 16:04:22 crc kubenswrapper[4795]: I0310 16:04:22.310428 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjnt" event={"ID":"9560dc25-b942-4872-b040-2b09be7d3c15","Type":"ContainerStarted","Data":"2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df"} Mar 10 16:04:22 crc kubenswrapper[4795]: I0310 16:04:22.333616 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-8cjnt" podStartSLOduration=2.899690451 podStartE2EDuration="7.33359544s" podCreationTimestamp="2026-03-10 16:04:15 +0000 UTC" firstStartedPulling="2026-03-10 16:04:17.252907545 +0000 UTC m=+3490.418648443" lastFinishedPulling="2026-03-10 16:04:21.686812534 +0000 UTC m=+3494.852553432" observedRunningTime="2026-03-10 16:04:22.330535632 +0000 UTC m=+3495.496276530" watchObservedRunningTime="2026-03-10 16:04:22.33359544 +0000 UTC m=+3495.499336338" Mar 10 16:04:23 crc kubenswrapper[4795]: I0310 16:04:23.664786 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_f3f06c2a-0098-46d6-96e5-6cbe9caf24ef/memcached/0.log" Mar 10 16:04:25 crc kubenswrapper[4795]: I0310 16:04:25.770533 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:25 crc kubenswrapper[4795]: I0310 16:04:25.771165 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:25 crc kubenswrapper[4795]: I0310 16:04:25.819751 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:26 crc kubenswrapper[4795]: I0310 16:04:26.382164 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:26 crc kubenswrapper[4795]: I0310 16:04:26.540362 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cjnt"] Mar 10 16:04:28 crc kubenswrapper[4795]: I0310 16:04:28.366058 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8cjnt" podUID="9560dc25-b942-4872-b040-2b09be7d3c15" containerName="registry-server" containerID="cri-o://2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df" 
gracePeriod=2 Mar 10 16:04:28 crc kubenswrapper[4795]: I0310 16:04:28.965527 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.087914 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-utilities\") pod \"9560dc25-b942-4872-b040-2b09be7d3c15\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.088146 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-catalog-content\") pod \"9560dc25-b942-4872-b040-2b09be7d3c15\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.088256 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v4gt\" (UniqueName: \"kubernetes.io/projected/9560dc25-b942-4872-b040-2b09be7d3c15-kube-api-access-9v4gt\") pod \"9560dc25-b942-4872-b040-2b09be7d3c15\" (UID: \"9560dc25-b942-4872-b040-2b09be7d3c15\") " Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.088796 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-utilities" (OuterVolumeSpecName: "utilities") pod "9560dc25-b942-4872-b040-2b09be7d3c15" (UID: "9560dc25-b942-4872-b040-2b09be7d3c15"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.095383 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9560dc25-b942-4872-b040-2b09be7d3c15-kube-api-access-9v4gt" (OuterVolumeSpecName: "kube-api-access-9v4gt") pod "9560dc25-b942-4872-b040-2b09be7d3c15" (UID: "9560dc25-b942-4872-b040-2b09be7d3c15"). InnerVolumeSpecName "kube-api-access-9v4gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.191202 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v4gt\" (UniqueName: \"kubernetes.io/projected/9560dc25-b942-4872-b040-2b09be7d3c15-kube-api-access-9v4gt\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.191519 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.375614 4795 generic.go:334] "Generic (PLEG): container finished" podID="9560dc25-b942-4872-b040-2b09be7d3c15" containerID="2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df" exitCode=0 Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.375652 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjnt" event={"ID":"9560dc25-b942-4872-b040-2b09be7d3c15","Type":"ContainerDied","Data":"2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df"} Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.375677 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8cjnt" event={"ID":"9560dc25-b942-4872-b040-2b09be7d3c15","Type":"ContainerDied","Data":"0da96a061037d602b9061d7e639937758d758cbec6ac444da986ace6f08ab15e"} Mar 10 16:04:29 crc kubenswrapper[4795]: 
I0310 16:04:29.375695 4795 scope.go:117] "RemoveContainer" containerID="2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.375809 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8cjnt" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.399526 4795 scope.go:117] "RemoveContainer" containerID="badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.413520 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9560dc25-b942-4872-b040-2b09be7d3c15" (UID: "9560dc25-b942-4872-b040-2b09be7d3c15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.421246 4795 scope.go:117] "RemoveContainer" containerID="9b42452cce523046249bb5f26c9b03fc4df69dfda71b1d151a9a67bbb06a3924" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.463921 4795 scope.go:117] "RemoveContainer" containerID="2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df" Mar 10 16:04:29 crc kubenswrapper[4795]: E0310 16:04:29.464539 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df\": container with ID starting with 2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df not found: ID does not exist" containerID="2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.464572 4795 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df"} err="failed to get container status \"2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df\": rpc error: code = NotFound desc = could not find container \"2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df\": container with ID starting with 2b402cef3185fb7d3e1f2c874a8030da84745384539c9ae0e2397587f10048df not found: ID does not exist" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.464601 4795 scope.go:117] "RemoveContainer" containerID="badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4" Mar 10 16:04:29 crc kubenswrapper[4795]: E0310 16:04:29.464769 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4\": container with ID starting with badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4 not found: ID does not exist" containerID="badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.464797 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4"} err="failed to get container status \"badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4\": rpc error: code = NotFound desc = could not find container \"badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4\": container with ID starting with badda05d47fa48fd9c9cef666ab6778bafdba5dd0fe65398f047307e9649bad4 not found: ID does not exist" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.464811 4795 scope.go:117] "RemoveContainer" containerID="9b42452cce523046249bb5f26c9b03fc4df69dfda71b1d151a9a67bbb06a3924" Mar 10 16:04:29 crc kubenswrapper[4795]: E0310 16:04:29.465080 4795 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9b42452cce523046249bb5f26c9b03fc4df69dfda71b1d151a9a67bbb06a3924\": container with ID starting with 9b42452cce523046249bb5f26c9b03fc4df69dfda71b1d151a9a67bbb06a3924 not found: ID does not exist" containerID="9b42452cce523046249bb5f26c9b03fc4df69dfda71b1d151a9a67bbb06a3924" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.465141 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b42452cce523046249bb5f26c9b03fc4df69dfda71b1d151a9a67bbb06a3924"} err="failed to get container status \"9b42452cce523046249bb5f26c9b03fc4df69dfda71b1d151a9a67bbb06a3924\": rpc error: code = NotFound desc = could not find container \"9b42452cce523046249bb5f26c9b03fc4df69dfda71b1d151a9a67bbb06a3924\": container with ID starting with 9b42452cce523046249bb5f26c9b03fc4df69dfda71b1d151a9a67bbb06a3924 not found: ID does not exist" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.508487 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9560dc25-b942-4872-b040-2b09be7d3c15-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.696581 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8cjnt"] Mar 10 16:04:29 crc kubenswrapper[4795]: I0310 16:04:29.705994 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8cjnt"] Mar 10 16:04:31 crc kubenswrapper[4795]: I0310 16:04:31.497193 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9560dc25-b942-4872-b040-2b09be7d3c15" path="/var/lib/kubelet/pods/9560dc25-b942-4872-b040-2b09be7d3c15/volumes" Mar 10 16:04:42 crc kubenswrapper[4795]: I0310 16:04:42.239761 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-tlmlb_a8f94485-8cb5-43ab-b30d-2ad0b0a7836a/manager/0.log" Mar 10 16:04:42 crc kubenswrapper[4795]: I0310 16:04:42.469120 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz_d2ffa720-9d8d-48bb-ba13-87d605145e4e/util/0.log" Mar 10 16:04:42 crc kubenswrapper[4795]: I0310 16:04:42.649758 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz_d2ffa720-9d8d-48bb-ba13-87d605145e4e/pull/0.log" Mar 10 16:04:42 crc kubenswrapper[4795]: I0310 16:04:42.683682 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz_d2ffa720-9d8d-48bb-ba13-87d605145e4e/util/0.log" Mar 10 16:04:42 crc kubenswrapper[4795]: I0310 16:04:42.893188 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz_d2ffa720-9d8d-48bb-ba13-87d605145e4e/pull/0.log" Mar 10 16:04:43 crc kubenswrapper[4795]: I0310 16:04:43.060704 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz_d2ffa720-9d8d-48bb-ba13-87d605145e4e/util/0.log" Mar 10 16:04:43 crc kubenswrapper[4795]: I0310 16:04:43.138982 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz_d2ffa720-9d8d-48bb-ba13-87d605145e4e/pull/0.log" Mar 10 16:04:43 crc kubenswrapper[4795]: I0310 16:04:43.257406 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f0083203efe9e6342af359799c43f95bb072312d753963b287968ff1f2g5stz_d2ffa720-9d8d-48bb-ba13-87d605145e4e/extract/0.log" Mar 10 16:04:43 crc 
kubenswrapper[4795]: I0310 16:04:43.328988 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-8tpdc_69ee85e7-4d8f-493c-8480-6eefec2091ae/manager/0.log" Mar 10 16:04:43 crc kubenswrapper[4795]: I0310 16:04:43.586169 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-vxz75_0171e721-a233-4112-ac5b-503a1aef22eb/manager/0.log" Mar 10 16:04:43 crc kubenswrapper[4795]: I0310 16:04:43.607370 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-qhk6r_552bb17b-df18-40ea-8688-f5f5c16e7c5b/manager/0.log" Mar 10 16:04:43 crc kubenswrapper[4795]: I0310 16:04:43.793085 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-kcdqn_d8e67a8d-0858-4177-b9a0-fa1ba281424c/manager/0.log" Mar 10 16:04:44 crc kubenswrapper[4795]: I0310 16:04:44.019190 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-h9tr6_d6c4189b-47e3-41a5-83b6-2e6673f8d595/manager/0.log" Mar 10 16:04:44 crc kubenswrapper[4795]: I0310 16:04:44.373237 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-xhvcm_9968b76f-ff42-4d0e-9096-a229cf314dcb/manager/0.log" Mar 10 16:04:44 crc kubenswrapper[4795]: I0310 16:04:44.380052 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-5995f4446f-4vlhk_ec0bfd76-f8c1-48a9-b35b-6307d31446e6/manager/0.log" Mar 10 16:04:44 crc kubenswrapper[4795]: I0310 16:04:44.576859 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-8k7b2_85d0556d-e44b-4a30-a0f9-076e356bceef/manager/0.log" Mar 10 
16:04:44 crc kubenswrapper[4795]: I0310 16:04:44.725359 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-7vht5_1d435dd6-f95b-4883-928d-d010f897bb68/manager/0.log" Mar 10 16:04:44 crc kubenswrapper[4795]: I0310 16:04:44.857147 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-sz4bh_9aefc86f-37f5-4056-9e78-0eb01103e984/manager/0.log" Mar 10 16:04:45 crc kubenswrapper[4795]: I0310 16:04:45.181373 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-bjdwk_7ffe6f4f-ca5a-4b61-977c-1fcd22035674/manager/0.log" Mar 10 16:04:45 crc kubenswrapper[4795]: I0310 16:04:45.209295 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-pfmkh_d3b35266-d392-4d69-8ba2-471d69708706/manager/0.log" Mar 10 16:04:45 crc kubenswrapper[4795]: I0310 16:04:45.436339 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7d5rnb_d7d04047-d616-4c35-a6f3-7767688d4393/manager/0.log" Mar 10 16:04:45 crc kubenswrapper[4795]: I0310 16:04:45.851898 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7c7f7d994-987ck_cb672437-b689-4c5a-ba13-934f96350bbf/operator/0.log" Mar 10 16:04:46 crc kubenswrapper[4795]: I0310 16:04:46.057360 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-bpvzt_15382ecd-b669-452c-8cee-6abdc8828035/registry-server/0.log" Mar 10 16:04:46 crc kubenswrapper[4795]: I0310 16:04:46.416910 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-tglhw_b01360ed-92af-4626-a226-7cf86bdd51e1/manager/0.log" Mar 
10 16:04:46 crc kubenswrapper[4795]: I0310 16:04:46.685818 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-kd79t_23620549-aa69-4e1b-bfb4-e335532a318c/manager/0.log" Mar 10 16:04:46 crc kubenswrapper[4795]: I0310 16:04:46.866771 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-hkmpm_44fbebe4-6a17-4378-9f39-bda40adb7e02/operator/0.log" Mar 10 16:04:47 crc kubenswrapper[4795]: I0310 16:04:47.053258 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-677c674df7-qz5k7_7a87e459-bb41-405f-8fea-040c8a223373/manager/0.log" Mar 10 16:04:47 crc kubenswrapper[4795]: I0310 16:04:47.214899 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-76d6f6bb5f-26dcx_69a31d53-90dd-46ca-a5ee-8841b89445e6/manager/0.log" Mar 10 16:04:47 crc kubenswrapper[4795]: I0310 16:04:47.242142 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6cd66dbd4b-5xwgv_5242d539-a4e7-4a4c-b485-e8c43ce52546/manager/0.log" Mar 10 16:04:47 crc kubenswrapper[4795]: I0310 16:04:47.356595 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-zsx7m_a4e07d9c-566b-4d59-869a-2d3720455624/manager/0.log" Mar 10 16:04:47 crc kubenswrapper[4795]: I0310 16:04:47.472048 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6dd88c6f67-fl2gq_f23149ba-6bcc-49ac-93fc-60092174c5a8/manager/0.log" Mar 10 16:04:48 crc kubenswrapper[4795]: I0310 16:04:48.395788 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-rgwvl_c5c14124-2fc6-4052-b12a-81336c47ae33/manager/0.log" Mar 10 16:04:48 crc kubenswrapper[4795]: I0310 16:04:48.539358 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:04:48 crc kubenswrapper[4795]: I0310 16:04:48.539624 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:04:48 crc kubenswrapper[4795]: I0310 16:04:48.539781 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 16:04:48 crc kubenswrapper[4795]: I0310 16:04:48.540615 4795 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f902561e08cc6ec569cf1763fcb9ca628dff6f707fa7c825cd56f8d025611eac"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:04:48 crc kubenswrapper[4795]: I0310 16:04:48.540790 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://f902561e08cc6ec569cf1763fcb9ca628dff6f707fa7c825cd56f8d025611eac" gracePeriod=600 Mar 10 16:04:49 crc kubenswrapper[4795]: I0310 16:04:49.362337 4795 
scope.go:117] "RemoveContainer" containerID="ef92c24147795359f51faf18bc3dd9841f47cea48899249a5faba8d4a66279f2" Mar 10 16:04:49 crc kubenswrapper[4795]: I0310 16:04:49.545221 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="f902561e08cc6ec569cf1763fcb9ca628dff6f707fa7c825cd56f8d025611eac" exitCode=0 Mar 10 16:04:49 crc kubenswrapper[4795]: I0310 16:04:49.545270 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"f902561e08cc6ec569cf1763fcb9ca628dff6f707fa7c825cd56f8d025611eac"} Mar 10 16:04:49 crc kubenswrapper[4795]: I0310 16:04:49.545306 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerStarted","Data":"9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5"} Mar 10 16:04:49 crc kubenswrapper[4795]: I0310 16:04:49.545330 4795 scope.go:117] "RemoveContainer" containerID="33e085dca8b79d7309bc381f03c702e205dd9480883363b4abec8ef07ef03471" Mar 10 16:05:06 crc kubenswrapper[4795]: I0310 16:05:06.172488 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-sd26k_afe203dd-0a5b-44c3-afd1-fe0452f276bc/control-plane-machine-set-operator/0.log" Mar 10 16:05:06 crc kubenswrapper[4795]: I0310 16:05:06.364167 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7csjs_3f0f577c-af8a-414f-ad38-1d1d839d472f/kube-rbac-proxy/0.log" Mar 10 16:05:06 crc kubenswrapper[4795]: I0310 16:05:06.408280 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7csjs_3f0f577c-af8a-414f-ad38-1d1d839d472f/machine-api-operator/0.log" 
Mar 10 16:05:18 crc kubenswrapper[4795]: I0310 16:05:18.436598 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-c987b_cedffdc5-be80-4b91-836a-261b0388fabd/cert-manager-controller/0.log" Mar 10 16:05:18 crc kubenswrapper[4795]: I0310 16:05:18.642381 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-6jbwd_c1e372af-e82c-4c9d-b29c-7428b5d7746f/cert-manager-cainjector/0.log" Mar 10 16:05:18 crc kubenswrapper[4795]: I0310 16:05:18.691809 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-gtsvs_f9920cd8-80d1-4f31-ad9d-fbc4a4b01f59/cert-manager-webhook/0.log" Mar 10 16:05:30 crc kubenswrapper[4795]: I0310 16:05:30.554840 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-5w9l5_73454f9e-adb1-4874-a046-8a74850d4667/nmstate-console-plugin/0.log" Mar 10 16:05:30 crc kubenswrapper[4795]: I0310 16:05:30.728534 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-wqm26_d4285f38-a8d4-4521-9f80-27346b23640e/nmstate-handler/0.log" Mar 10 16:05:30 crc kubenswrapper[4795]: I0310 16:05:30.834727 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-cj65x_30c516e5-88f8-4da7-a095-d56867635a94/nmstate-metrics/0.log" Mar 10 16:05:30 crc kubenswrapper[4795]: I0310 16:05:30.838570 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-cj65x_30c516e5-88f8-4da7-a095-d56867635a94/kube-rbac-proxy/0.log" Mar 10 16:05:30 crc kubenswrapper[4795]: I0310 16:05:30.993143 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-h2mf9_f1ae1d31-73b1-4619-91f2-39b6f1a8ad62/nmstate-operator/0.log" Mar 10 16:05:31 crc kubenswrapper[4795]: I0310 16:05:31.054579 4795 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-k85t4_f7da0eaa-841e-4750-9018-2264cc0142ff/nmstate-webhook/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.000380 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-r22j4_1ca8c314-7be3-4435-bb1c-8a27d57e2f3d/kube-rbac-proxy/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.095994 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-r22j4_1ca8c314-7be3-4435-bb1c-8a27d57e2f3d/controller/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.192113 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-frr-files/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.417871 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-frr-files/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.435399 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-reloader/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.445868 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-reloader/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.472932 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-metrics/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.724272 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-reloader/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.727401 4795 log.go:25] "Finished parsing log 
file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-metrics/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.729604 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-metrics/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.759748 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-frr-files/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.931181 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-frr-files/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.931350 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-reloader/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.968626 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/controller/0.log" Mar 10 16:05:57 crc kubenswrapper[4795]: I0310 16:05:57.991083 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/cp-metrics/0.log" Mar 10 16:05:58 crc kubenswrapper[4795]: I0310 16:05:58.130926 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/frr-metrics/0.log" Mar 10 16:05:58 crc kubenswrapper[4795]: I0310 16:05:58.150581 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/kube-rbac-proxy/0.log" Mar 10 16:05:58 crc kubenswrapper[4795]: I0310 16:05:58.189482 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/kube-rbac-proxy-frr/0.log" Mar 10 16:05:58 crc kubenswrapper[4795]: I0310 16:05:58.328605 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/reloader/0.log" Mar 10 16:05:58 crc kubenswrapper[4795]: I0310 16:05:58.461417 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-hsznt_ecac241b-bada-4193-b7e0-e771dce28a24/frr-k8s-webhook-server/0.log" Mar 10 16:05:58 crc kubenswrapper[4795]: I0310 16:05:58.622260 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5585467d4f-8qzds_92eb5433-b34f-4b2b-bddf-ebe3de747f71/manager/0.log" Mar 10 16:05:58 crc kubenswrapper[4795]: I0310 16:05:58.844719 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-58c9c77f74-sds7h_313c3847-2713-4511-ad55-237f00ee0d8e/webhook-server/0.log" Mar 10 16:05:58 crc kubenswrapper[4795]: I0310 16:05:58.950494 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2tb54_2d1c83bb-1fd9-4500-95b5-ded04a953128/kube-rbac-proxy/0.log" Mar 10 16:05:59 crc kubenswrapper[4795]: I0310 16:05:59.532945 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-2tb54_2d1c83bb-1fd9-4500-95b5-ded04a953128/speaker/0.log" Mar 10 16:05:59 crc kubenswrapper[4795]: I0310 16:05:59.656328 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l6l79_8a0a1463-1ecd-456c-9e61-7d954ebcbce4/frr/0.log" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.144254 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552646-sf9cw"] Mar 10 16:06:00 crc kubenswrapper[4795]: E0310 16:06:00.144653 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9560dc25-b942-4872-b040-2b09be7d3c15" containerName="registry-server" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.144670 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9560dc25-b942-4872-b040-2b09be7d3c15" containerName="registry-server" Mar 10 16:06:00 crc kubenswrapper[4795]: E0310 16:06:00.144710 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9560dc25-b942-4872-b040-2b09be7d3c15" containerName="extract-content" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.144754 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9560dc25-b942-4872-b040-2b09be7d3c15" containerName="extract-content" Mar 10 16:06:00 crc kubenswrapper[4795]: E0310 16:06:00.144788 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9560dc25-b942-4872-b040-2b09be7d3c15" containerName="extract-utilities" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.144798 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="9560dc25-b942-4872-b040-2b09be7d3c15" containerName="extract-utilities" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.145012 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="9560dc25-b942-4872-b040-2b09be7d3c15" containerName="registry-server" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.145656 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-sf9cw" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.165833 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-sf9cw"] Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.189008 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.189312 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.189466 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.226373 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8k4t\" (UniqueName: \"kubernetes.io/projected/4983243b-b679-483c-8433-45ca6deb1fde-kube-api-access-b8k4t\") pod \"auto-csr-approver-29552646-sf9cw\" (UID: \"4983243b-b679-483c-8433-45ca6deb1fde\") " pod="openshift-infra/auto-csr-approver-29552646-sf9cw" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.327923 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8k4t\" (UniqueName: \"kubernetes.io/projected/4983243b-b679-483c-8433-45ca6deb1fde-kube-api-access-b8k4t\") pod \"auto-csr-approver-29552646-sf9cw\" (UID: \"4983243b-b679-483c-8433-45ca6deb1fde\") " pod="openshift-infra/auto-csr-approver-29552646-sf9cw" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.352776 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8k4t\" (UniqueName: \"kubernetes.io/projected/4983243b-b679-483c-8433-45ca6deb1fde-kube-api-access-b8k4t\") pod \"auto-csr-approver-29552646-sf9cw\" (UID: \"4983243b-b679-483c-8433-45ca6deb1fde\") " 
pod="openshift-infra/auto-csr-approver-29552646-sf9cw" Mar 10 16:06:00 crc kubenswrapper[4795]: I0310 16:06:00.504465 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-sf9cw" Mar 10 16:06:01 crc kubenswrapper[4795]: I0310 16:06:01.029289 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-sf9cw"] Mar 10 16:06:01 crc kubenswrapper[4795]: I0310 16:06:01.045513 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:06:01 crc kubenswrapper[4795]: I0310 16:06:01.208632 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552646-sf9cw" event={"ID":"4983243b-b679-483c-8433-45ca6deb1fde","Type":"ContainerStarted","Data":"480ecbc08460569b29472f35aa31eee8a9346d81348805046568cd027ddce473"} Mar 10 16:06:03 crc kubenswrapper[4795]: I0310 16:06:03.228106 4795 generic.go:334] "Generic (PLEG): container finished" podID="4983243b-b679-483c-8433-45ca6deb1fde" containerID="5aed8f5531c95bce10384991debc398bead69095575f6209f749cd33a4434e1b" exitCode=0 Mar 10 16:06:03 crc kubenswrapper[4795]: I0310 16:06:03.228184 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552646-sf9cw" event={"ID":"4983243b-b679-483c-8433-45ca6deb1fde","Type":"ContainerDied","Data":"5aed8f5531c95bce10384991debc398bead69095575f6209f749cd33a4434e1b"} Mar 10 16:06:04 crc kubenswrapper[4795]: I0310 16:06:04.593983 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-sf9cw" Mar 10 16:06:04 crc kubenswrapper[4795]: I0310 16:06:04.704568 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8k4t\" (UniqueName: \"kubernetes.io/projected/4983243b-b679-483c-8433-45ca6deb1fde-kube-api-access-b8k4t\") pod \"4983243b-b679-483c-8433-45ca6deb1fde\" (UID: \"4983243b-b679-483c-8433-45ca6deb1fde\") " Mar 10 16:06:04 crc kubenswrapper[4795]: I0310 16:06:04.713222 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4983243b-b679-483c-8433-45ca6deb1fde-kube-api-access-b8k4t" (OuterVolumeSpecName: "kube-api-access-b8k4t") pod "4983243b-b679-483c-8433-45ca6deb1fde" (UID: "4983243b-b679-483c-8433-45ca6deb1fde"). InnerVolumeSpecName "kube-api-access-b8k4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:06:04 crc kubenswrapper[4795]: I0310 16:06:04.807602 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8k4t\" (UniqueName: \"kubernetes.io/projected/4983243b-b679-483c-8433-45ca6deb1fde-kube-api-access-b8k4t\") on node \"crc\" DevicePath \"\"" Mar 10 16:06:05 crc kubenswrapper[4795]: I0310 16:06:05.255990 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552646-sf9cw" event={"ID":"4983243b-b679-483c-8433-45ca6deb1fde","Type":"ContainerDied","Data":"480ecbc08460569b29472f35aa31eee8a9346d81348805046568cd027ddce473"} Mar 10 16:06:05 crc kubenswrapper[4795]: I0310 16:06:05.256320 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="480ecbc08460569b29472f35aa31eee8a9346d81348805046568cd027ddce473" Mar 10 16:06:05 crc kubenswrapper[4795]: I0310 16:06:05.256033 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552646-sf9cw" Mar 10 16:06:05 crc kubenswrapper[4795]: I0310 16:06:05.694188 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-bgmfb"] Mar 10 16:06:05 crc kubenswrapper[4795]: I0310 16:06:05.706007 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552640-bgmfb"] Mar 10 16:06:07 crc kubenswrapper[4795]: I0310 16:06:07.489752 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="909e2760-8a48-4c97-8a05-caa0e928ee7c" path="/var/lib/kubelet/pods/909e2760-8a48-4c97-8a05-caa0e928ee7c/volumes" Mar 10 16:06:12 crc kubenswrapper[4795]: I0310 16:06:12.659002 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf_debdf9b8-7f1a-4d6a-be68-78d832d39089/util/0.log" Mar 10 16:06:12 crc kubenswrapper[4795]: I0310 16:06:12.777442 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf_debdf9b8-7f1a-4d6a-be68-78d832d39089/util/0.log" Mar 10 16:06:12 crc kubenswrapper[4795]: I0310 16:06:12.907912 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf_debdf9b8-7f1a-4d6a-be68-78d832d39089/pull/0.log" Mar 10 16:06:12 crc kubenswrapper[4795]: I0310 16:06:12.911516 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf_debdf9b8-7f1a-4d6a-be68-78d832d39089/pull/0.log" Mar 10 16:06:13 crc kubenswrapper[4795]: I0310 16:06:13.116945 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf_debdf9b8-7f1a-4d6a-be68-78d832d39089/extract/0.log" Mar 10 
16:06:13 crc kubenswrapper[4795]: I0310 16:06:13.142919 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf_debdf9b8-7f1a-4d6a-be68-78d832d39089/pull/0.log" Mar 10 16:06:13 crc kubenswrapper[4795]: I0310 16:06:13.149395 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82bwrjf_debdf9b8-7f1a-4d6a-be68-78d832d39089/util/0.log" Mar 10 16:06:13 crc kubenswrapper[4795]: I0310 16:06:13.323707 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zdl2_1c92a54b-5890-4e03-8c5a-c02f308fa42c/extract-utilities/0.log" Mar 10 16:06:13 crc kubenswrapper[4795]: I0310 16:06:13.512103 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zdl2_1c92a54b-5890-4e03-8c5a-c02f308fa42c/extract-utilities/0.log" Mar 10 16:06:13 crc kubenswrapper[4795]: I0310 16:06:13.563912 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zdl2_1c92a54b-5890-4e03-8c5a-c02f308fa42c/extract-content/0.log" Mar 10 16:06:13 crc kubenswrapper[4795]: I0310 16:06:13.569537 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zdl2_1c92a54b-5890-4e03-8c5a-c02f308fa42c/extract-content/0.log" Mar 10 16:06:13 crc kubenswrapper[4795]: I0310 16:06:13.685178 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zdl2_1c92a54b-5890-4e03-8c5a-c02f308fa42c/extract-content/0.log" Mar 10 16:06:13 crc kubenswrapper[4795]: I0310 16:06:13.754036 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zdl2_1c92a54b-5890-4e03-8c5a-c02f308fa42c/extract-utilities/0.log" Mar 10 16:06:13 crc kubenswrapper[4795]: I0310 16:06:13.890237 
4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxkhv_2290378c-c594-45e6-9436-0bff75e32af9/extract-utilities/0.log" Mar 10 16:06:14 crc kubenswrapper[4795]: I0310 16:06:14.116982 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxkhv_2290378c-c594-45e6-9436-0bff75e32af9/extract-content/0.log" Mar 10 16:06:14 crc kubenswrapper[4795]: I0310 16:06:14.134487 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxkhv_2290378c-c594-45e6-9436-0bff75e32af9/extract-utilities/0.log" Mar 10 16:06:14 crc kubenswrapper[4795]: I0310 16:06:14.140425 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxkhv_2290378c-c594-45e6-9436-0bff75e32af9/extract-content/0.log" Mar 10 16:06:14 crc kubenswrapper[4795]: I0310 16:06:14.300427 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9zdl2_1c92a54b-5890-4e03-8c5a-c02f308fa42c/registry-server/0.log" Mar 10 16:06:14 crc kubenswrapper[4795]: I0310 16:06:14.409972 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxkhv_2290378c-c594-45e6-9436-0bff75e32af9/extract-utilities/0.log" Mar 10 16:06:14 crc kubenswrapper[4795]: I0310 16:06:14.465967 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qxkhv_2290378c-c594-45e6-9436-0bff75e32af9/extract-content/0.log" Mar 10 16:06:14 crc kubenswrapper[4795]: I0310 16:06:14.679561 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm_62a78361-debb-4191-af7d-24be60f6fe39/util/0.log" Mar 10 16:06:14 crc kubenswrapper[4795]: I0310 16:06:14.918297 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-qxkhv_2290378c-c594-45e6-9436-0bff75e32af9/registry-server/0.log" Mar 10 16:06:14 crc kubenswrapper[4795]: I0310 16:06:14.963277 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm_62a78361-debb-4191-af7d-24be60f6fe39/pull/0.log" Mar 10 16:06:14 crc kubenswrapper[4795]: I0310 16:06:14.978881 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm_62a78361-debb-4191-af7d-24be60f6fe39/util/0.log" Mar 10 16:06:14 crc kubenswrapper[4795]: I0310 16:06:14.982471 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm_62a78361-debb-4191-af7d-24be60f6fe39/pull/0.log" Mar 10 16:06:15 crc kubenswrapper[4795]: I0310 16:06:15.134147 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm_62a78361-debb-4191-af7d-24be60f6fe39/util/0.log" Mar 10 16:06:15 crc kubenswrapper[4795]: I0310 16:06:15.142335 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm_62a78361-debb-4191-af7d-24be60f6fe39/extract/0.log" Mar 10 16:06:15 crc kubenswrapper[4795]: I0310 16:06:15.142820 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f49w2hm_62a78361-debb-4191-af7d-24be60f6fe39/pull/0.log" Mar 10 16:06:15 crc kubenswrapper[4795]: I0310 16:06:15.588057 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pj98w_fb933310-8cd4-41f5-8a2e-2956f51956e1/extract-utilities/0.log" Mar 10 16:06:15 crc kubenswrapper[4795]: I0310 
16:06:15.592193 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-n48hs_e285d423-5d70-4e87-aed9-13bf768889ec/marketplace-operator/0.log" Mar 10 16:06:15 crc kubenswrapper[4795]: I0310 16:06:15.745047 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pj98w_fb933310-8cd4-41f5-8a2e-2956f51956e1/extract-content/0.log" Mar 10 16:06:15 crc kubenswrapper[4795]: I0310 16:06:15.785686 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pj98w_fb933310-8cd4-41f5-8a2e-2956f51956e1/extract-utilities/0.log" Mar 10 16:06:15 crc kubenswrapper[4795]: I0310 16:06:15.808285 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pj98w_fb933310-8cd4-41f5-8a2e-2956f51956e1/extract-content/0.log" Mar 10 16:06:16 crc kubenswrapper[4795]: I0310 16:06:16.046965 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pj98w_fb933310-8cd4-41f5-8a2e-2956f51956e1/extract-utilities/0.log" Mar 10 16:06:16 crc kubenswrapper[4795]: I0310 16:06:16.049967 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pj98w_fb933310-8cd4-41f5-8a2e-2956f51956e1/extract-content/0.log" Mar 10 16:06:16 crc kubenswrapper[4795]: I0310 16:06:16.076472 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-pj98w_fb933310-8cd4-41f5-8a2e-2956f51956e1/registry-server/0.log" Mar 10 16:06:16 crc kubenswrapper[4795]: I0310 16:06:16.218007 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vn6b2_dade6b65-b7ed-45c3-bf75-28e8d62d94c6/extract-utilities/0.log" Mar 10 16:06:16 crc kubenswrapper[4795]: I0310 16:06:16.376019 4795 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-vn6b2_dade6b65-b7ed-45c3-bf75-28e8d62d94c6/extract-utilities/0.log" Mar 10 16:06:16 crc kubenswrapper[4795]: I0310 16:06:16.379176 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vn6b2_dade6b65-b7ed-45c3-bf75-28e8d62d94c6/extract-content/0.log" Mar 10 16:06:16 crc kubenswrapper[4795]: I0310 16:06:16.380119 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vn6b2_dade6b65-b7ed-45c3-bf75-28e8d62d94c6/extract-content/0.log" Mar 10 16:06:16 crc kubenswrapper[4795]: I0310 16:06:16.554020 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vn6b2_dade6b65-b7ed-45c3-bf75-28e8d62d94c6/extract-utilities/0.log" Mar 10 16:06:16 crc kubenswrapper[4795]: I0310 16:06:16.557576 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vn6b2_dade6b65-b7ed-45c3-bf75-28e8d62d94c6/extract-content/0.log" Mar 10 16:06:17 crc kubenswrapper[4795]: I0310 16:06:17.038019 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vn6b2_dade6b65-b7ed-45c3-bf75-28e8d62d94c6/registry-server/0.log" Mar 10 16:06:48 crc kubenswrapper[4795]: I0310 16:06:48.539238 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:06:48 crc kubenswrapper[4795]: I0310 16:06:48.539872 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 10 16:06:49 crc kubenswrapper[4795]: I0310 16:06:49.509491 4795 scope.go:117] "RemoveContainer" containerID="b669ebe71e808658b91eb86b7a11019b79c48f37fcab8d4a209a74b50e68f8dd" Mar 10 16:07:18 crc kubenswrapper[4795]: I0310 16:07:18.539039 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:07:18 crc kubenswrapper[4795]: I0310 16:07:18.539514 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:07:48 crc kubenswrapper[4795]: I0310 16:07:48.539179 4795 patch_prober.go:28] interesting pod/machine-config-daemon-747vh container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 10 16:07:48 crc kubenswrapper[4795]: I0310 16:07:48.539843 4795 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 10 16:07:48 crc kubenswrapper[4795]: I0310 16:07:48.539917 4795 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-747vh" Mar 10 16:07:48 crc kubenswrapper[4795]: I0310 16:07:48.541060 4795 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5"} pod="openshift-machine-config-operator/machine-config-daemon-747vh" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 10 16:07:48 crc kubenswrapper[4795]: I0310 16:07:48.541205 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" containerName="machine-config-daemon" containerID="cri-o://9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" gracePeriod=600 Mar 10 16:07:48 crc kubenswrapper[4795]: E0310 16:07:48.670375 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:07:49 crc kubenswrapper[4795]: I0310 16:07:49.224761 4795 generic.go:334] "Generic (PLEG): container finished" podID="92ceb516-b88c-44bd-b534-25ea21b31379" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" exitCode=0 Mar 10 16:07:49 crc kubenswrapper[4795]: I0310 16:07:49.224897 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-747vh" event={"ID":"92ceb516-b88c-44bd-b534-25ea21b31379","Type":"ContainerDied","Data":"9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5"} Mar 10 16:07:49 crc kubenswrapper[4795]: I0310 16:07:49.225160 4795 scope.go:117] "RemoveContainer" 
containerID="f902561e08cc6ec569cf1763fcb9ca628dff6f707fa7c825cd56f8d025611eac" Mar 10 16:07:49 crc kubenswrapper[4795]: I0310 16:07:49.225740 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:07:49 crc kubenswrapper[4795]: E0310 16:07:49.226114 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:07:59 crc kubenswrapper[4795]: I0310 16:07:59.476497 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:07:59 crc kubenswrapper[4795]: E0310 16:07:59.477197 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.145241 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552648-xjkg2"] Mar 10 16:08:00 crc kubenswrapper[4795]: E0310 16:08:00.145636 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4983243b-b679-483c-8433-45ca6deb1fde" containerName="oc" Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.145648 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="4983243b-b679-483c-8433-45ca6deb1fde" containerName="oc" Mar 10 16:08:00 crc 
kubenswrapper[4795]: I0310 16:08:00.145830 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="4983243b-b679-483c-8433-45ca6deb1fde" containerName="oc" Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.146455 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-xjkg2" Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.150220 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.150409 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.150557 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.161695 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552648-xjkg2"] Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.223486 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sslws\" (UniqueName: \"kubernetes.io/projected/d3514ac9-dd0a-487c-8b10-d361e6465f94-kube-api-access-sslws\") pod \"auto-csr-approver-29552648-xjkg2\" (UID: \"d3514ac9-dd0a-487c-8b10-d361e6465f94\") " pod="openshift-infra/auto-csr-approver-29552648-xjkg2" Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.325366 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sslws\" (UniqueName: \"kubernetes.io/projected/d3514ac9-dd0a-487c-8b10-d361e6465f94-kube-api-access-sslws\") pod \"auto-csr-approver-29552648-xjkg2\" (UID: \"d3514ac9-dd0a-487c-8b10-d361e6465f94\") " pod="openshift-infra/auto-csr-approver-29552648-xjkg2" Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.347715 4795 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sslws\" (UniqueName: \"kubernetes.io/projected/d3514ac9-dd0a-487c-8b10-d361e6465f94-kube-api-access-sslws\") pod \"auto-csr-approver-29552648-xjkg2\" (UID: \"d3514ac9-dd0a-487c-8b10-d361e6465f94\") " pod="openshift-infra/auto-csr-approver-29552648-xjkg2" Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.468082 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-xjkg2" Mar 10 16:08:00 crc kubenswrapper[4795]: I0310 16:08:00.931976 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552648-xjkg2"] Mar 10 16:08:01 crc kubenswrapper[4795]: I0310 16:08:01.356494 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552648-xjkg2" event={"ID":"d3514ac9-dd0a-487c-8b10-d361e6465f94","Type":"ContainerStarted","Data":"cf1fb7e72240c3e8cb81d9869220ecb06994ee5ea84a5f4834bacba088aff1c4"} Mar 10 16:08:02 crc kubenswrapper[4795]: I0310 16:08:02.373755 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552648-xjkg2" event={"ID":"d3514ac9-dd0a-487c-8b10-d361e6465f94","Type":"ContainerStarted","Data":"7ef84c09581f0588e7b3f4f38e2bcc940cdfc8695514914efd5e724d056698ab"} Mar 10 16:08:02 crc kubenswrapper[4795]: I0310 16:08:02.388177 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552648-xjkg2" podStartSLOduration=1.372119597 podStartE2EDuration="2.388155782s" podCreationTimestamp="2026-03-10 16:08:00 +0000 UTC" firstStartedPulling="2026-03-10 16:08:00.938245206 +0000 UTC m=+3714.103986104" lastFinishedPulling="2026-03-10 16:08:01.954281391 +0000 UTC m=+3715.120022289" observedRunningTime="2026-03-10 16:08:02.387143693 +0000 UTC m=+3715.552884611" watchObservedRunningTime="2026-03-10 16:08:02.388155782 +0000 UTC m=+3715.553896680" Mar 
10 16:08:03 crc kubenswrapper[4795]: I0310 16:08:03.388656 4795 generic.go:334] "Generic (PLEG): container finished" podID="d3514ac9-dd0a-487c-8b10-d361e6465f94" containerID="7ef84c09581f0588e7b3f4f38e2bcc940cdfc8695514914efd5e724d056698ab" exitCode=0 Mar 10 16:08:03 crc kubenswrapper[4795]: I0310 16:08:03.388939 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552648-xjkg2" event={"ID":"d3514ac9-dd0a-487c-8b10-d361e6465f94","Type":"ContainerDied","Data":"7ef84c09581f0588e7b3f4f38e2bcc940cdfc8695514914efd5e724d056698ab"} Mar 10 16:08:04 crc kubenswrapper[4795]: I0310 16:08:04.402256 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a1c9fc1-975f-44c7-95b4-a731abb4a742" containerID="9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3" exitCode=0 Mar 10 16:08:04 crc kubenswrapper[4795]: I0310 16:08:04.402361 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7sdzb/must-gather-9ghfg" event={"ID":"7a1c9fc1-975f-44c7-95b4-a731abb4a742","Type":"ContainerDied","Data":"9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3"} Mar 10 16:08:04 crc kubenswrapper[4795]: I0310 16:08:04.402959 4795 scope.go:117] "RemoveContainer" containerID="9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3" Mar 10 16:08:04 crc kubenswrapper[4795]: I0310 16:08:04.676459 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-xjkg2" Mar 10 16:08:04 crc kubenswrapper[4795]: I0310 16:08:04.825789 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sslws\" (UniqueName: \"kubernetes.io/projected/d3514ac9-dd0a-487c-8b10-d361e6465f94-kube-api-access-sslws\") pod \"d3514ac9-dd0a-487c-8b10-d361e6465f94\" (UID: \"d3514ac9-dd0a-487c-8b10-d361e6465f94\") " Mar 10 16:08:04 crc kubenswrapper[4795]: I0310 16:08:04.832721 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3514ac9-dd0a-487c-8b10-d361e6465f94-kube-api-access-sslws" (OuterVolumeSpecName: "kube-api-access-sslws") pod "d3514ac9-dd0a-487c-8b10-d361e6465f94" (UID: "d3514ac9-dd0a-487c-8b10-d361e6465f94"). InnerVolumeSpecName "kube-api-access-sslws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:04 crc kubenswrapper[4795]: I0310 16:08:04.922539 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7sdzb_must-gather-9ghfg_7a1c9fc1-975f-44c7-95b4-a731abb4a742/gather/0.log" Mar 10 16:08:04 crc kubenswrapper[4795]: I0310 16:08:04.927411 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sslws\" (UniqueName: \"kubernetes.io/projected/d3514ac9-dd0a-487c-8b10-d361e6465f94-kube-api-access-sslws\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:05 crc kubenswrapper[4795]: I0310 16:08:05.418128 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552648-xjkg2" event={"ID":"d3514ac9-dd0a-487c-8b10-d361e6465f94","Type":"ContainerDied","Data":"cf1fb7e72240c3e8cb81d9869220ecb06994ee5ea84a5f4834bacba088aff1c4"} Mar 10 16:08:05 crc kubenswrapper[4795]: I0310 16:08:05.418459 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf1fb7e72240c3e8cb81d9869220ecb06994ee5ea84a5f4834bacba088aff1c4" Mar 10 16:08:05 crc kubenswrapper[4795]: 
I0310 16:08:05.418215 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552648-xjkg2" Mar 10 16:08:05 crc kubenswrapper[4795]: I0310 16:08:05.458695 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-l6bbt"] Mar 10 16:08:05 crc kubenswrapper[4795]: I0310 16:08:05.468741 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552642-l6bbt"] Mar 10 16:08:05 crc kubenswrapper[4795]: I0310 16:08:05.511770 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4780d693-bde7-430b-b1ed-4ff5c27f3b0a" path="/var/lib/kubelet/pods/4780d693-bde7-430b-b1ed-4ff5c27f3b0a/volumes" Mar 10 16:08:10 crc kubenswrapper[4795]: I0310 16:08:10.478234 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:08:10 crc kubenswrapper[4795]: E0310 16:08:10.479481 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:08:12 crc kubenswrapper[4795]: I0310 16:08:12.926377 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-7sdzb/must-gather-9ghfg"] Mar 10 16:08:12 crc kubenswrapper[4795]: I0310 16:08:12.928249 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-7sdzb/must-gather-9ghfg" podUID="7a1c9fc1-975f-44c7-95b4-a731abb4a742" containerName="copy" containerID="cri-o://df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96" gracePeriod=2 Mar 10 16:08:12 crc kubenswrapper[4795]: 
I0310 16:08:12.938933 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-7sdzb/must-gather-9ghfg"] Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.321277 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7sdzb_must-gather-9ghfg_7a1c9fc1-975f-44c7-95b4-a731abb4a742/copy/0.log" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.321987 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7sdzb/must-gather-9ghfg" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.494224 4795 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-7sdzb_must-gather-9ghfg_7a1c9fc1-975f-44c7-95b4-a731abb4a742/copy/0.log" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.494944 4795 generic.go:334] "Generic (PLEG): container finished" podID="7a1c9fc1-975f-44c7-95b4-a731abb4a742" containerID="df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96" exitCode=143 Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.494983 4795 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7sdzb/must-gather-9ghfg" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.495001 4795 scope.go:117] "RemoveContainer" containerID="df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.512794 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a1c9fc1-975f-44c7-95b4-a731abb4a742-must-gather-output\") pod \"7a1c9fc1-975f-44c7-95b4-a731abb4a742\" (UID: \"7a1c9fc1-975f-44c7-95b4-a731abb4a742\") " Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.513076 4795 scope.go:117] "RemoveContainer" containerID="9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.514117 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5v8p\" (UniqueName: \"kubernetes.io/projected/7a1c9fc1-975f-44c7-95b4-a731abb4a742-kube-api-access-g5v8p\") pod \"7a1c9fc1-975f-44c7-95b4-a731abb4a742\" (UID: \"7a1c9fc1-975f-44c7-95b4-a731abb4a742\") " Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.527747 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a1c9fc1-975f-44c7-95b4-a731abb4a742-kube-api-access-g5v8p" (OuterVolumeSpecName: "kube-api-access-g5v8p") pod "7a1c9fc1-975f-44c7-95b4-a731abb4a742" (UID: "7a1c9fc1-975f-44c7-95b4-a731abb4a742"). InnerVolumeSpecName "kube-api-access-g5v8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.568969 4795 scope.go:117] "RemoveContainer" containerID="df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96" Mar 10 16:08:13 crc kubenswrapper[4795]: E0310 16:08:13.571278 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96\": container with ID starting with df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96 not found: ID does not exist" containerID="df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.571327 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96"} err="failed to get container status \"df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96\": rpc error: code = NotFound desc = could not find container \"df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96\": container with ID starting with df602996ba01a52565bbed72f22d0944d0b22c0cb20ec0c18040976f63ee1b96 not found: ID does not exist" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.571354 4795 scope.go:117] "RemoveContainer" containerID="9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3" Mar 10 16:08:13 crc kubenswrapper[4795]: E0310 16:08:13.571863 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3\": container with ID starting with 9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3 not found: ID does not exist" containerID="9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.571887 
4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3"} err="failed to get container status \"9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3\": rpc error: code = NotFound desc = could not find container \"9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3\": container with ID starting with 9d1fb34ca68d6b85cf8507c432b1006e69549b1dfe5a0d6c169ab37a2d3ab0d3 not found: ID does not exist" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.617037 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5v8p\" (UniqueName: \"kubernetes.io/projected/7a1c9fc1-975f-44c7-95b4-a731abb4a742-kube-api-access-g5v8p\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.733438 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a1c9fc1-975f-44c7-95b4-a731abb4a742-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7a1c9fc1-975f-44c7-95b4-a731abb4a742" (UID: "7a1c9fc1-975f-44c7-95b4-a731abb4a742"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:08:13 crc kubenswrapper[4795]: I0310 16:08:13.821952 4795 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a1c9fc1-975f-44c7-95b4-a731abb4a742-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 10 16:08:15 crc kubenswrapper[4795]: I0310 16:08:15.492133 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a1c9fc1-975f-44c7-95b4-a731abb4a742" path="/var/lib/kubelet/pods/7a1c9fc1-975f-44c7-95b4-a731abb4a742/volumes" Mar 10 16:08:21 crc kubenswrapper[4795]: I0310 16:08:21.477380 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:08:21 crc kubenswrapper[4795]: E0310 16:08:21.478060 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:08:35 crc kubenswrapper[4795]: I0310 16:08:35.476954 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:08:35 crc kubenswrapper[4795]: E0310 16:08:35.477860 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:08:49 crc kubenswrapper[4795]: I0310 16:08:49.476338 4795 
scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:08:49 crc kubenswrapper[4795]: E0310 16:08:49.476982 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:08:49 crc kubenswrapper[4795]: I0310 16:08:49.624813 4795 scope.go:117] "RemoveContainer" containerID="ec16fb270fe3a7b7e0b1d3efac6801254a3ab63a09b53ce2ef79eac76979f684" Mar 10 16:09:00 crc kubenswrapper[4795]: I0310 16:09:00.476557 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:09:00 crc kubenswrapper[4795]: E0310 16:09:00.477767 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:09:11 crc kubenswrapper[4795]: I0310 16:09:11.476776 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:09:11 crc kubenswrapper[4795]: E0310 16:09:11.477533 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:09:22 crc kubenswrapper[4795]: I0310 16:09:22.476529 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:09:22 crc kubenswrapper[4795]: E0310 16:09:22.477587 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:09:33 crc kubenswrapper[4795]: I0310 16:09:33.476688 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:09:33 crc kubenswrapper[4795]: E0310 16:09:33.477404 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:09:46 crc kubenswrapper[4795]: I0310 16:09:46.478378 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:09:46 crc kubenswrapper[4795]: E0310 16:09:46.479230 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:09:59 crc kubenswrapper[4795]: I0310 16:09:59.478242 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:09:59 crc kubenswrapper[4795]: E0310 16:09:59.480241 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.154176 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552650-lg6gs"] Mar 10 16:10:00 crc kubenswrapper[4795]: E0310 16:10:00.154984 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3514ac9-dd0a-487c-8b10-d361e6465f94" containerName="oc" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.155005 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3514ac9-dd0a-487c-8b10-d361e6465f94" containerName="oc" Mar 10 16:10:00 crc kubenswrapper[4795]: E0310 16:10:00.155036 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a1c9fc1-975f-44c7-95b4-a731abb4a742" containerName="copy" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.155044 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c9fc1-975f-44c7-95b4-a731abb4a742" containerName="copy" Mar 10 16:10:00 crc kubenswrapper[4795]: E0310 16:10:00.155086 4795 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7a1c9fc1-975f-44c7-95b4-a731abb4a742" containerName="gather" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.155095 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a1c9fc1-975f-44c7-95b4-a731abb4a742" containerName="gather" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.155339 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3514ac9-dd0a-487c-8b10-d361e6465f94" containerName="oc" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.155380 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c9fc1-975f-44c7-95b4-a731abb4a742" containerName="gather" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.155394 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a1c9fc1-975f-44c7-95b4-a731abb4a742" containerName="copy" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.156341 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-lg6gs" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.158744 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.158788 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.159737 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.172547 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552650-lg6gs"] Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.266928 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlpjg\" (UniqueName: 
\"kubernetes.io/projected/8b0f1cc9-7e13-47e1-abaa-58875d4560bf-kube-api-access-nlpjg\") pod \"auto-csr-approver-29552650-lg6gs\" (UID: \"8b0f1cc9-7e13-47e1-abaa-58875d4560bf\") " pod="openshift-infra/auto-csr-approver-29552650-lg6gs" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.368522 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlpjg\" (UniqueName: \"kubernetes.io/projected/8b0f1cc9-7e13-47e1-abaa-58875d4560bf-kube-api-access-nlpjg\") pod \"auto-csr-approver-29552650-lg6gs\" (UID: \"8b0f1cc9-7e13-47e1-abaa-58875d4560bf\") " pod="openshift-infra/auto-csr-approver-29552650-lg6gs" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.390548 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlpjg\" (UniqueName: \"kubernetes.io/projected/8b0f1cc9-7e13-47e1-abaa-58875d4560bf-kube-api-access-nlpjg\") pod \"auto-csr-approver-29552650-lg6gs\" (UID: \"8b0f1cc9-7e13-47e1-abaa-58875d4560bf\") " pod="openshift-infra/auto-csr-approver-29552650-lg6gs" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.479086 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-lg6gs" Mar 10 16:10:00 crc kubenswrapper[4795]: I0310 16:10:00.925626 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552650-lg6gs"] Mar 10 16:10:01 crc kubenswrapper[4795]: I0310 16:10:01.558544 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-lg6gs" event={"ID":"8b0f1cc9-7e13-47e1-abaa-58875d4560bf","Type":"ContainerStarted","Data":"74ef51d14fc3455c42af5d56617d6a71b0c34415a87caa361c72804d743c892c"} Mar 10 16:10:02 crc kubenswrapper[4795]: I0310 16:10:02.568484 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-lg6gs" event={"ID":"8b0f1cc9-7e13-47e1-abaa-58875d4560bf","Type":"ContainerStarted","Data":"770a6beb6fee9a70a4e0995fc690a1326c26ad2cde3eb181abc944456d16056d"} Mar 10 16:10:02 crc kubenswrapper[4795]: I0310 16:10:02.591393 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29552650-lg6gs" podStartSLOduration=1.3336743260000001 podStartE2EDuration="2.591370415s" podCreationTimestamp="2026-03-10 16:10:00 +0000 UTC" firstStartedPulling="2026-03-10 16:10:00.925826076 +0000 UTC m=+3834.091566974" lastFinishedPulling="2026-03-10 16:10:02.183522155 +0000 UTC m=+3835.349263063" observedRunningTime="2026-03-10 16:10:02.579875027 +0000 UTC m=+3835.745615925" watchObservedRunningTime="2026-03-10 16:10:02.591370415 +0000 UTC m=+3835.757111313" Mar 10 16:10:03 crc kubenswrapper[4795]: I0310 16:10:03.598383 4795 generic.go:334] "Generic (PLEG): container finished" podID="8b0f1cc9-7e13-47e1-abaa-58875d4560bf" containerID="770a6beb6fee9a70a4e0995fc690a1326c26ad2cde3eb181abc944456d16056d" exitCode=0 Mar 10 16:10:03 crc kubenswrapper[4795]: I0310 16:10:03.598487 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-lg6gs" 
event={"ID":"8b0f1cc9-7e13-47e1-abaa-58875d4560bf","Type":"ContainerDied","Data":"770a6beb6fee9a70a4e0995fc690a1326c26ad2cde3eb181abc944456d16056d"} Mar 10 16:10:04 crc kubenswrapper[4795]: I0310 16:10:04.954322 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-lg6gs" Mar 10 16:10:04 crc kubenswrapper[4795]: I0310 16:10:04.963567 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlpjg\" (UniqueName: \"kubernetes.io/projected/8b0f1cc9-7e13-47e1-abaa-58875d4560bf-kube-api-access-nlpjg\") pod \"8b0f1cc9-7e13-47e1-abaa-58875d4560bf\" (UID: \"8b0f1cc9-7e13-47e1-abaa-58875d4560bf\") " Mar 10 16:10:04 crc kubenswrapper[4795]: I0310 16:10:04.969659 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b0f1cc9-7e13-47e1-abaa-58875d4560bf-kube-api-access-nlpjg" (OuterVolumeSpecName: "kube-api-access-nlpjg") pod "8b0f1cc9-7e13-47e1-abaa-58875d4560bf" (UID: "8b0f1cc9-7e13-47e1-abaa-58875d4560bf"). InnerVolumeSpecName "kube-api-access-nlpjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:10:05 crc kubenswrapper[4795]: I0310 16:10:05.064999 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlpjg\" (UniqueName: \"kubernetes.io/projected/8b0f1cc9-7e13-47e1-abaa-58875d4560bf-kube-api-access-nlpjg\") on node \"crc\" DevicePath \"\"" Mar 10 16:10:05 crc kubenswrapper[4795]: I0310 16:10:05.620843 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552650-lg6gs" event={"ID":"8b0f1cc9-7e13-47e1-abaa-58875d4560bf","Type":"ContainerDied","Data":"74ef51d14fc3455c42af5d56617d6a71b0c34415a87caa361c72804d743c892c"} Mar 10 16:10:05 crc kubenswrapper[4795]: I0310 16:10:05.621240 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74ef51d14fc3455c42af5d56617d6a71b0c34415a87caa361c72804d743c892c" Mar 10 16:10:05 crc kubenswrapper[4795]: I0310 16:10:05.620912 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552650-lg6gs" Mar 10 16:10:05 crc kubenswrapper[4795]: I0310 16:10:05.652043 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-g6vgh"] Mar 10 16:10:05 crc kubenswrapper[4795]: I0310 16:10:05.659935 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552644-g6vgh"] Mar 10 16:10:07 crc kubenswrapper[4795]: I0310 16:10:07.491298 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="420efd3a-8ad6-4262-821d-17a19ce0c8ae" path="/var/lib/kubelet/pods/420efd3a-8ad6-4262-821d-17a19ce0c8ae/volumes" Mar 10 16:10:11 crc kubenswrapper[4795]: I0310 16:10:11.477004 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:10:11 crc kubenswrapper[4795]: E0310 16:10:11.477785 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:10:23 crc kubenswrapper[4795]: I0310 16:10:23.476362 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:10:23 crc kubenswrapper[4795]: E0310 16:10:23.477229 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:10:35 crc kubenswrapper[4795]: I0310 16:10:35.477164 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:10:35 crc kubenswrapper[4795]: E0310 16:10:35.477857 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:10:46 crc kubenswrapper[4795]: I0310 16:10:46.476765 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:10:46 crc kubenswrapper[4795]: E0310 16:10:46.477547 4795 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:10:49 crc kubenswrapper[4795]: I0310 16:10:49.741800 4795 scope.go:117] "RemoveContainer" containerID="a680b6bb33ff0415a2633ce8f9989cf2d313a1644cb9c31b808e6f996eafaf87" Mar 10 16:10:57 crc kubenswrapper[4795]: I0310 16:10:57.487331 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:10:57 crc kubenswrapper[4795]: E0310 16:10:57.491008 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:11:11 crc kubenswrapper[4795]: I0310 16:11:11.477116 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:11:11 crc kubenswrapper[4795]: E0310 16:11:11.477837 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:11:24 crc kubenswrapper[4795]: I0310 16:11:24.476015 4795 scope.go:117] 
"RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:11:24 crc kubenswrapper[4795]: E0310 16:11:24.477026 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.595084 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rtj8f"] Mar 10 16:11:33 crc kubenswrapper[4795]: E0310 16:11:33.595988 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b0f1cc9-7e13-47e1-abaa-58875d4560bf" containerName="oc" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.596002 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b0f1cc9-7e13-47e1-abaa-58875d4560bf" containerName="oc" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.596261 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b0f1cc9-7e13-47e1-abaa-58875d4560bf" containerName="oc" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.597760 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.617192 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rtj8f"] Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.790848 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9mx4\" (UniqueName: \"kubernetes.io/projected/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-kube-api-access-p9mx4\") pod \"redhat-operators-rtj8f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.791164 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-catalog-content\") pod \"redhat-operators-rtj8f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.791295 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-utilities\") pod \"redhat-operators-rtj8f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.892734 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9mx4\" (UniqueName: \"kubernetes.io/projected/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-kube-api-access-p9mx4\") pod \"redhat-operators-rtj8f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.892820 4795 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-catalog-content\") pod \"redhat-operators-rtj8f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.892892 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-utilities\") pod \"redhat-operators-rtj8f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.893301 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-catalog-content\") pod \"redhat-operators-rtj8f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.893338 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-utilities\") pod \"redhat-operators-rtj8f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:33 crc kubenswrapper[4795]: I0310 16:11:33.923984 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9mx4\" (UniqueName: \"kubernetes.io/projected/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-kube-api-access-p9mx4\") pod \"redhat-operators-rtj8f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:34 crc kubenswrapper[4795]: I0310 16:11:34.219652 4795 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:34 crc kubenswrapper[4795]: I0310 16:11:34.793994 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rtj8f"] Mar 10 16:11:35 crc kubenswrapper[4795]: I0310 16:11:35.433408 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" containerID="de8aba0cca428749ad2c39283d19c3df408c4a4f0c1da47f38aad38d2deb8ea2" exitCode=0 Mar 10 16:11:35 crc kubenswrapper[4795]: I0310 16:11:35.433490 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtj8f" event={"ID":"7e7373a7-7dd7-448d-9096-ad2bf1bc115f","Type":"ContainerDied","Data":"de8aba0cca428749ad2c39283d19c3df408c4a4f0c1da47f38aad38d2deb8ea2"} Mar 10 16:11:35 crc kubenswrapper[4795]: I0310 16:11:35.433767 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtj8f" event={"ID":"7e7373a7-7dd7-448d-9096-ad2bf1bc115f","Type":"ContainerStarted","Data":"47d6621db8658653256abcf833fdc67dc0f719388954b322214e96ead03b348d"} Mar 10 16:11:35 crc kubenswrapper[4795]: I0310 16:11:35.435211 4795 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 10 16:11:37 crc kubenswrapper[4795]: I0310 16:11:37.460518 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" containerID="98fbcee452907ccd35ff0c523aba0cad8f1d92263740b7e00a9677e65b1c7193" exitCode=0 Mar 10 16:11:37 crc kubenswrapper[4795]: I0310 16:11:37.460626 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtj8f" event={"ID":"7e7373a7-7dd7-448d-9096-ad2bf1bc115f","Type":"ContainerDied","Data":"98fbcee452907ccd35ff0c523aba0cad8f1d92263740b7e00a9677e65b1c7193"} Mar 10 16:11:37 crc kubenswrapper[4795]: I0310 16:11:37.477085 4795 scope.go:117] "RemoveContainer" 
containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5" Mar 10 16:11:37 crc kubenswrapper[4795]: E0310 16:11:37.477430 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379" Mar 10 16:11:38 crc kubenswrapper[4795]: I0310 16:11:38.474152 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtj8f" event={"ID":"7e7373a7-7dd7-448d-9096-ad2bf1bc115f","Type":"ContainerStarted","Data":"b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085"} Mar 10 16:11:38 crc kubenswrapper[4795]: I0310 16:11:38.502452 4795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rtj8f" podStartSLOduration=3.082850558 podStartE2EDuration="5.502434309s" podCreationTimestamp="2026-03-10 16:11:33 +0000 UTC" firstStartedPulling="2026-03-10 16:11:35.434998593 +0000 UTC m=+3928.600739491" lastFinishedPulling="2026-03-10 16:11:37.854582314 +0000 UTC m=+3931.020323242" observedRunningTime="2026-03-10 16:11:38.496423518 +0000 UTC m=+3931.662164456" watchObservedRunningTime="2026-03-10 16:11:38.502434309 +0000 UTC m=+3931.668175207" Mar 10 16:11:44 crc kubenswrapper[4795]: I0310 16:11:44.220312 4795 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:44 crc kubenswrapper[4795]: I0310 16:11:44.220917 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:44 crc kubenswrapper[4795]: I0310 16:11:44.267490 4795 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:44 crc kubenswrapper[4795]: I0310 16:11:44.564882 4795 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:44 crc kubenswrapper[4795]: I0310 16:11:44.608950 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rtj8f"] Mar 10 16:11:46 crc kubenswrapper[4795]: I0310 16:11:46.578423 4795 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rtj8f" podUID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" containerName="registry-server" containerID="cri-o://b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085" gracePeriod=2 Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.013875 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rtj8f" Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.212153 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-catalog-content\") pod \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.212232 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-utilities\") pod \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.212373 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9mx4\" (UniqueName: \"kubernetes.io/projected/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-kube-api-access-p9mx4\") pod 
\"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\" (UID: \"7e7373a7-7dd7-448d-9096-ad2bf1bc115f\") " Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.212979 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-utilities" (OuterVolumeSpecName: "utilities") pod "7e7373a7-7dd7-448d-9096-ad2bf1bc115f" (UID: "7e7373a7-7dd7-448d-9096-ad2bf1bc115f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.217191 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-kube-api-access-p9mx4" (OuterVolumeSpecName: "kube-api-access-p9mx4") pod "7e7373a7-7dd7-448d-9096-ad2bf1bc115f" (UID: "7e7373a7-7dd7-448d-9096-ad2bf1bc115f"). InnerVolumeSpecName "kube-api-access-p9mx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.314504 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9mx4\" (UniqueName: \"kubernetes.io/projected/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-kube-api-access-p9mx4\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.314541 4795 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-utilities\") on node \"crc\" DevicePath \"\"" Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.589688 4795 generic.go:334] "Generic (PLEG): container finished" podID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" containerID="b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085" exitCode=0 Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.589726 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtj8f" 
event={"ID":"7e7373a7-7dd7-448d-9096-ad2bf1bc115f","Type":"ContainerDied","Data":"b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085"}
Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.589752 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rtj8f" event={"ID":"7e7373a7-7dd7-448d-9096-ad2bf1bc115f","Type":"ContainerDied","Data":"47d6621db8658653256abcf833fdc67dc0f719388954b322214e96ead03b348d"}
Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.589768 4795 scope.go:117] "RemoveContainer" containerID="b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085"
Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.589777 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rtj8f"
Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.607530 4795 scope.go:117] "RemoveContainer" containerID="98fbcee452907ccd35ff0c523aba0cad8f1d92263740b7e00a9677e65b1c7193"
Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.628502 4795 scope.go:117] "RemoveContainer" containerID="de8aba0cca428749ad2c39283d19c3df408c4a4f0c1da47f38aad38d2deb8ea2"
Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.667794 4795 scope.go:117] "RemoveContainer" containerID="b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085"
Mar 10 16:11:47 crc kubenswrapper[4795]: E0310 16:11:47.668162 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085\": container with ID starting with b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085 not found: ID does not exist" containerID="b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085"
Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.668198 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085"} err="failed to get container status \"b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085\": rpc error: code = NotFound desc = could not find container \"b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085\": container with ID starting with b7c2a698f5860f62983aced3c517058536195e5620af54ff5158e16001f16085 not found: ID does not exist"
Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.668224 4795 scope.go:117] "RemoveContainer" containerID="98fbcee452907ccd35ff0c523aba0cad8f1d92263740b7e00a9677e65b1c7193"
Mar 10 16:11:47 crc kubenswrapper[4795]: E0310 16:11:47.668455 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98fbcee452907ccd35ff0c523aba0cad8f1d92263740b7e00a9677e65b1c7193\": container with ID starting with 98fbcee452907ccd35ff0c523aba0cad8f1d92263740b7e00a9677e65b1c7193 not found: ID does not exist" containerID="98fbcee452907ccd35ff0c523aba0cad8f1d92263740b7e00a9677e65b1c7193"
Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.668482 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98fbcee452907ccd35ff0c523aba0cad8f1d92263740b7e00a9677e65b1c7193"} err="failed to get container status \"98fbcee452907ccd35ff0c523aba0cad8f1d92263740b7e00a9677e65b1c7193\": rpc error: code = NotFound desc = could not find container \"98fbcee452907ccd35ff0c523aba0cad8f1d92263740b7e00a9677e65b1c7193\": container with ID starting with 98fbcee452907ccd35ff0c523aba0cad8f1d92263740b7e00a9677e65b1c7193 not found: ID does not exist"
Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.668499 4795 scope.go:117] "RemoveContainer" containerID="de8aba0cca428749ad2c39283d19c3df408c4a4f0c1da47f38aad38d2deb8ea2"
Mar 10 16:11:47 crc kubenswrapper[4795]: E0310 16:11:47.668705 4795 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de8aba0cca428749ad2c39283d19c3df408c4a4f0c1da47f38aad38d2deb8ea2\": container with ID starting with de8aba0cca428749ad2c39283d19c3df408c4a4f0c1da47f38aad38d2deb8ea2 not found: ID does not exist" containerID="de8aba0cca428749ad2c39283d19c3df408c4a4f0c1da47f38aad38d2deb8ea2"
Mar 10 16:11:47 crc kubenswrapper[4795]: I0310 16:11:47.668730 4795 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de8aba0cca428749ad2c39283d19c3df408c4a4f0c1da47f38aad38d2deb8ea2"} err="failed to get container status \"de8aba0cca428749ad2c39283d19c3df408c4a4f0c1da47f38aad38d2deb8ea2\": rpc error: code = NotFound desc = could not find container \"de8aba0cca428749ad2c39283d19c3df408c4a4f0c1da47f38aad38d2deb8ea2\": container with ID starting with de8aba0cca428749ad2c39283d19c3df408c4a4f0c1da47f38aad38d2deb8ea2 not found: ID does not exist"
Mar 10 16:11:49 crc kubenswrapper[4795]: I0310 16:11:49.061027 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e7373a7-7dd7-448d-9096-ad2bf1bc115f" (UID: "7e7373a7-7dd7-448d-9096-ad2bf1bc115f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 10 16:11:49 crc kubenswrapper[4795]: I0310 16:11:49.131271 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rtj8f"]
Mar 10 16:11:49 crc kubenswrapper[4795]: I0310 16:11:49.139493 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rtj8f"]
Mar 10 16:11:49 crc kubenswrapper[4795]: I0310 16:11:49.149675 4795 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e7373a7-7dd7-448d-9096-ad2bf1bc115f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 10 16:11:49 crc kubenswrapper[4795]: I0310 16:11:49.554183 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" path="/var/lib/kubelet/pods/7e7373a7-7dd7-448d-9096-ad2bf1bc115f/volumes"
Mar 10 16:11:52 crc kubenswrapper[4795]: I0310 16:11:52.476778 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5"
Mar 10 16:11:52 crc kubenswrapper[4795]: E0310 16:11:52.477576 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.166377 4795 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29552652-z6psj"]
Mar 10 16:12:00 crc kubenswrapper[4795]: E0310 16:12:00.167272 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" containerName="extract-content"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.167320 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" containerName="extract-content"
Mar 10 16:12:00 crc kubenswrapper[4795]: E0310 16:12:00.167332 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" containerName="registry-server"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.167338 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" containerName="registry-server"
Mar 10 16:12:00 crc kubenswrapper[4795]: E0310 16:12:00.167355 4795 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" containerName="extract-utilities"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.167361 4795 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" containerName="extract-utilities"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.167541 4795 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e7373a7-7dd7-448d-9096-ad2bf1bc115f" containerName="registry-server"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.168245 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-z6psj"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.172401 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.172548 4795 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rhls5"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.173918 4795 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.199517 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552652-z6psj"]
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.205370 4795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4n6f\" (UniqueName: \"kubernetes.io/projected/ce9747b4-d66f-4bfb-9230-f7f0f7f7130a-kube-api-access-q4n6f\") pod \"auto-csr-approver-29552652-z6psj\" (UID: \"ce9747b4-d66f-4bfb-9230-f7f0f7f7130a\") " pod="openshift-infra/auto-csr-approver-29552652-z6psj"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.308236 4795 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4n6f\" (UniqueName: \"kubernetes.io/projected/ce9747b4-d66f-4bfb-9230-f7f0f7f7130a-kube-api-access-q4n6f\") pod \"auto-csr-approver-29552652-z6psj\" (UID: \"ce9747b4-d66f-4bfb-9230-f7f0f7f7130a\") " pod="openshift-infra/auto-csr-approver-29552652-z6psj"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.328738 4795 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4n6f\" (UniqueName: \"kubernetes.io/projected/ce9747b4-d66f-4bfb-9230-f7f0f7f7130a-kube-api-access-q4n6f\") pod \"auto-csr-approver-29552652-z6psj\" (UID: \"ce9747b4-d66f-4bfb-9230-f7f0f7f7130a\") " pod="openshift-infra/auto-csr-approver-29552652-z6psj"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.528824 4795 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-z6psj"
Mar 10 16:12:00 crc kubenswrapper[4795]: I0310 16:12:00.968586 4795 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29552652-z6psj"]
Mar 10 16:12:01 crc kubenswrapper[4795]: I0310 16:12:01.725196 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552652-z6psj" event={"ID":"ce9747b4-d66f-4bfb-9230-f7f0f7f7130a","Type":"ContainerStarted","Data":"3851bd966da5464ce0342ccd7de9d0964b4b092f3bfd3b6e5dac624c4794e1ac"}
Mar 10 16:12:02 crc kubenswrapper[4795]: I0310 16:12:02.734349 4795 generic.go:334] "Generic (PLEG): container finished" podID="ce9747b4-d66f-4bfb-9230-f7f0f7f7130a" containerID="68f63a64f1b9b5c07671080d56c9a6c117f4bddcb9d67f80cec02ddaa64907b5" exitCode=0
Mar 10 16:12:02 crc kubenswrapper[4795]: I0310 16:12:02.734408 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552652-z6psj" event={"ID":"ce9747b4-d66f-4bfb-9230-f7f0f7f7130a","Type":"ContainerDied","Data":"68f63a64f1b9b5c07671080d56c9a6c117f4bddcb9d67f80cec02ddaa64907b5"}
Mar 10 16:12:03 crc kubenswrapper[4795]: I0310 16:12:03.476679 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5"
Mar 10 16:12:03 crc kubenswrapper[4795]: E0310 16:12:03.476956 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 16:12:04 crc kubenswrapper[4795]: I0310 16:12:04.078577 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-z6psj"
Mar 10 16:12:04 crc kubenswrapper[4795]: I0310 16:12:04.234978 4795 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4n6f\" (UniqueName: \"kubernetes.io/projected/ce9747b4-d66f-4bfb-9230-f7f0f7f7130a-kube-api-access-q4n6f\") pod \"ce9747b4-d66f-4bfb-9230-f7f0f7f7130a\" (UID: \"ce9747b4-d66f-4bfb-9230-f7f0f7f7130a\") "
Mar 10 16:12:04 crc kubenswrapper[4795]: I0310 16:12:04.240853 4795 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce9747b4-d66f-4bfb-9230-f7f0f7f7130a-kube-api-access-q4n6f" (OuterVolumeSpecName: "kube-api-access-q4n6f") pod "ce9747b4-d66f-4bfb-9230-f7f0f7f7130a" (UID: "ce9747b4-d66f-4bfb-9230-f7f0f7f7130a"). InnerVolumeSpecName "kube-api-access-q4n6f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 10 16:12:04 crc kubenswrapper[4795]: I0310 16:12:04.337457 4795 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4n6f\" (UniqueName: \"kubernetes.io/projected/ce9747b4-d66f-4bfb-9230-f7f0f7f7130a-kube-api-access-q4n6f\") on node \"crc\" DevicePath \"\""
Mar 10 16:12:04 crc kubenswrapper[4795]: I0310 16:12:04.760917 4795 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29552652-z6psj" event={"ID":"ce9747b4-d66f-4bfb-9230-f7f0f7f7130a","Type":"ContainerDied","Data":"3851bd966da5464ce0342ccd7de9d0964b4b092f3bfd3b6e5dac624c4794e1ac"}
Mar 10 16:12:04 crc kubenswrapper[4795]: I0310 16:12:04.761004 4795 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3851bd966da5464ce0342ccd7de9d0964b4b092f3bfd3b6e5dac624c4794e1ac"
Mar 10 16:12:04 crc kubenswrapper[4795]: I0310 16:12:04.761031 4795 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29552652-z6psj"
Mar 10 16:12:05 crc kubenswrapper[4795]: I0310 16:12:05.154429 4795 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-sf9cw"]
Mar 10 16:12:05 crc kubenswrapper[4795]: I0310 16:12:05.165606 4795 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29552646-sf9cw"]
Mar 10 16:12:05 crc kubenswrapper[4795]: I0310 16:12:05.492542 4795 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4983243b-b679-483c-8433-45ca6deb1fde" path="/var/lib/kubelet/pods/4983243b-b679-483c-8433-45ca6deb1fde/volumes"
Mar 10 16:12:16 crc kubenswrapper[4795]: I0310 16:12:16.476228 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5"
Mar 10 16:12:16 crc kubenswrapper[4795]: E0310 16:12:16.476903 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 16:12:30 crc kubenswrapper[4795]: I0310 16:12:30.477509 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5"
Mar 10 16:12:30 crc kubenswrapper[4795]: E0310 16:12:30.479055 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 16:12:43 crc kubenswrapper[4795]: I0310 16:12:43.477518 4795 scope.go:117] "RemoveContainer" containerID="9ab83502e41241af5cffefbdb4c979baa41ff4685391e40990f246b3dd1fb2d5"
Mar 10 16:12:43 crc kubenswrapper[4795]: E0310 16:12:43.478724 4795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-747vh_openshift-machine-config-operator(92ceb516-b88c-44bd-b534-25ea21b31379)\"" pod="openshift-machine-config-operator/machine-config-daemon-747vh" podUID="92ceb516-b88c-44bd-b534-25ea21b31379"
Mar 10 16:12:49 crc kubenswrapper[4795]: I0310 16:12:49.841195 4795 scope.go:117] "RemoveContainer" containerID="5aed8f5531c95bce10384991debc398bead69095575f6209f749cd33a4434e1b"